Test Report: Hyperkit_macOS 19349

0359be70ee85a493d9f37ccc73e8278336c81275:2024-07-31:35584
Failed tests (24/227)

TestOffline (149.72s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 start -p offline-docker-296000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit 
aab_offline_test.go:55: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p offline-docker-296000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit : exit status 80 (2m24.317907811s)

-- stdout --
	* [offline-docker-296000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "offline-docker-296000" primary control-plane node in "offline-docker-296000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "offline-docker-296000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	I0731 10:43:35.825883    5604 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:43:35.826133    5604 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:43:35.826139    5604 out.go:304] Setting ErrFile to fd 2...
	I0731 10:43:35.826143    5604 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:43:35.826331    5604 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:43:35.856724    5604 out.go:298] Setting JSON to false
	I0731 10:43:35.881781    5604 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":4385,"bootTime":1722443430,"procs":451,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:43:35.881899    5604 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:43:35.939604    5604 out.go:177] * [offline-docker-296000] minikube v1.33.1 on Darwin 14.5
	I0731 10:43:36.002217    5604 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:43:36.002240    5604 notify.go:220] Checking for updates...
	I0731 10:43:36.046286    5604 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:43:36.067257    5604 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:43:36.088368    5604 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:43:36.109421    5604 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:43:36.130140    5604 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:43:36.152599    5604 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:43:36.181276    5604 out.go:177] * Using the hyperkit driver based on user configuration
	I0731 10:43:36.223563    5604 start.go:297] selected driver: hyperkit
	I0731 10:43:36.223592    5604 start.go:901] validating driver "hyperkit" against <nil>
	I0731 10:43:36.223614    5604 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:43:36.227906    5604 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:43:36.228022    5604 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:43:36.236243    5604 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:43:36.239962    5604 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:43:36.239985    5604 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:43:36.240022    5604 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 10:43:36.240244    5604 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:43:36.240272    5604 cni.go:84] Creating CNI manager for ""
	I0731 10:43:36.240290    5604 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0731 10:43:36.240296    5604 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0731 10:43:36.240361    5604 start.go:340] cluster config:
	{Name:offline-docker-296000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:offline-docker-296000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.loca
l ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: S
SHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:43:36.240446    5604 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:43:36.309432    5604 out.go:177] * Starting "offline-docker-296000" primary control-plane node in "offline-docker-296000" cluster
	I0731 10:43:36.351176    5604 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:43:36.351209    5604 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:43:36.351225    5604 cache.go:56] Caching tarball of preloaded images
	I0731 10:43:36.351338    5604 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:43:36.351347    5604 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:43:36.351628    5604 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/offline-docker-296000/config.json ...
	I0731 10:43:36.351648    5604 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/offline-docker-296000/config.json: {Name:mk9fa7e95129f1245fbf03cb7185bf6103070ba4 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:43:36.351987    5604 start.go:360] acquireMachinesLock for offline-docker-296000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:43:36.352049    5604 start.go:364] duration metric: took 47.622µs to acquireMachinesLock for "offline-docker-296000"
	I0731 10:43:36.352081    5604 start.go:93] Provisioning new machine with config: &{Name:offline-docker-296000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesC
onfig:{KubernetesVersion:v1.30.3 ClusterName:offline-docker-296000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions
:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:43:36.352130    5604 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 10:43:36.373479    5604 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0731 10:43:36.373632    5604 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:43:36.373681    5604 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:43:36.382334    5604 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54068
	I0731 10:43:36.382707    5604 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:43:36.383196    5604 main.go:141] libmachine: Using API Version  1
	I0731 10:43:36.383209    5604 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:43:36.383468    5604 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:43:36.383589    5604 main.go:141] libmachine: (offline-docker-296000) Calling .GetMachineName
	I0731 10:43:36.383683    5604 main.go:141] libmachine: (offline-docker-296000) Calling .DriverName
	I0731 10:43:36.383780    5604 start.go:159] libmachine.API.Create for "offline-docker-296000" (driver="hyperkit")
	I0731 10:43:36.383805    5604 client.go:168] LocalClient.Create starting
	I0731 10:43:36.383839    5604 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 10:43:36.383895    5604 main.go:141] libmachine: Decoding PEM data...
	I0731 10:43:36.383914    5604 main.go:141] libmachine: Parsing certificate...
	I0731 10:43:36.383993    5604 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 10:43:36.384032    5604 main.go:141] libmachine: Decoding PEM data...
	I0731 10:43:36.384044    5604 main.go:141] libmachine: Parsing certificate...
	I0731 10:43:36.384065    5604 main.go:141] libmachine: Running pre-create checks...
	I0731 10:43:36.384073    5604 main.go:141] libmachine: (offline-docker-296000) Calling .PreCreateCheck
	I0731 10:43:36.384147    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:36.384315    5604 main.go:141] libmachine: (offline-docker-296000) Calling .GetConfigRaw
	I0731 10:43:36.394870    5604 main.go:141] libmachine: Creating machine...
	I0731 10:43:36.394900    5604 main.go:141] libmachine: (offline-docker-296000) Calling .Create
	I0731 10:43:36.395120    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:36.395400    5604 main.go:141] libmachine: (offline-docker-296000) DBG | I0731 10:43:36.395106    5625 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:43:36.395518    5604 main.go:141] libmachine: (offline-docker-296000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 10:43:36.855360    5604 main.go:141] libmachine: (offline-docker-296000) DBG | I0731 10:43:36.855300    5625 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/id_rsa...
	I0731 10:43:37.009785    5604 main.go:141] libmachine: (offline-docker-296000) DBG | I0731 10:43:37.009694    5625 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/offline-docker-296000.rawdisk...
	I0731 10:43:37.009798    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Writing magic tar header
	I0731 10:43:37.009808    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Writing SSH key tar header
	I0731 10:43:37.010191    5604 main.go:141] libmachine: (offline-docker-296000) DBG | I0731 10:43:37.010129    5625 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000 ...
	I0731 10:43:37.445769    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:37.445810    5604 main.go:141] libmachine: (offline-docker-296000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/hyperkit.pid
	I0731 10:43:37.445830    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Using UUID db0f9a8b-928d-4f3d-8cf8-523cb6eede97
	I0731 10:43:37.665278    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Generated MAC b2:40:43:6a:92:e
	I0731 10:43:37.665313    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-296000
	I0731 10:43:37.665348    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"db0f9a8b-928d-4f3d-8cf8-523cb6eede97", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001e2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLi
ne:"", process:(*os.Process)(nil)}
	I0731 10:43:37.665382    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"db0f9a8b-928d-4f3d-8cf8-523cb6eede97", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001e2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLi
ne:"", process:(*os.Process)(nil)}
	I0731 10:43:37.665419    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "db0f9a8b-928d-4f3d-8cf8-523cb6eede97", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/offline-docker-296000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/bzimage,
/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-296000"}
	I0731 10:43:37.665518    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U db0f9a8b-928d-4f3d-8cf8-523cb6eede97 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/offline-docker-296000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machi
nes/offline-docker-296000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-296000"
	I0731 10:43:37.665541    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:43:37.669220    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 DEBUG: hyperkit: Pid is 5650
	I0731 10:43:37.669753    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 0
	I0731 10:43:37.669770    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:37.669807    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:43:37.670756    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:43:37.670806    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:43:37.670820    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:43:37.670852    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:43:37.670869    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:43:37.670882    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:43:37.670909    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:43:37.670923    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:43:37.670941    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:43:37.670953    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:43:37.670967    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:43:37.670980    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:43:37.670991    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:43:37.671003    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:43:37.671014    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:43:37.671042    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:43:37.671066    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:43:37.671085    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:43:37.671104    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:43:37.671118    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:43:37.671136    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:43:37.676648    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:43:37.730639    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:43:37.748387    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:43:37.748422    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:43:37.748437    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:43:37.748469    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:43:38.125632    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:38 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:43:38.125648    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:38 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:43:38.241049    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:43:38.241075    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:43:38.241091    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:43:38.241098    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:43:38.241894    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:38 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:43:38.241908    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:38 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:43:39.671701    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 1
	I0731 10:43:39.671713    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:39.671769    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:43:39.672677    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:43:39.672706    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:43:39.672720    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:43:39.672728    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:43:39.672742    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:43:39.672753    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:43:39.672760    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:43:39.672766    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:43:39.672773    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:43:39.672780    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:43:39.672787    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:43:39.672799    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:43:39.672812    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:43:39.672829    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:43:39.672842    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:43:39.672851    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:43:39.672864    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:43:39.672878    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:43:39.672889    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:43:39.672896    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:43:39.672907    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:43:41.674623    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 2
	I0731 10:43:41.674639    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:41.674717    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:43:41.675598    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:43:41.675661    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:43:41.675677    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:43:41.675695    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:43:41.675718    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:43:41.675725    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:43:41.675733    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:43:41.675741    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:43:41.675753    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:43:41.675765    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:43:41.675778    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:43:41.675786    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:43:41.675796    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:43:41.675805    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:43:41.675812    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:43:41.675822    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:43:41.675829    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:43:41.675837    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:43:41.675851    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:43:41.675861    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:43:41.675871    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
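The attempts above show the hyperkit driver polling `/var/db/dhcpd_leases` every ~2 seconds for the VM's MAC address (`b2:40:43:6a:92:e`, which never appears among the 19 leases, so the search keeps retrying until it times out). A minimal, hypothetical sketch of that lookup — matching the `{Name:... IPAddress:... HWAddress:...}` entry format printed in the log, not minikube's actual implementation — might look like:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// findLeaseIP scans dhcpd_leases-style text for the given hardware address
// and returns the associated IP. Field names mirror the log's dhcp entries;
// this is an illustrative sketch, not the driver's real parser.
func findLeaseIP(leases, hwaddr string) (string, bool) {
	// Capture the IPAddress and HWAddress fields of each entry.
	re := regexp.MustCompile(`IPAddress:(\S+) HWAddress:(\S+)`)
	for _, m := range re.FindAllStringSubmatch(leases, -1) {
		// Case-insensitive compare; macOS writes MAC octets without
		// zero-padding (note the single-digit "e" in b2:40:43:6a:92:e).
		if strings.EqualFold(m[2], hwaddr) {
			return m[1], true
		}
	}
	return "", false
}

func main() {
	sample := "{Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}"
	ip, ok := findLeaseIP(sample, "fe:c8:d1:ca:3b:79")
	fmt.Println(ip, ok)
	_, ok = findLeaseIP(sample, "b2:40:43:6a:92:e") // the MAC the log searches for
	fmt.Println(ok)
}
```

Because the target MAC never shows up in any lease entry, each attempt falls through to a retry — which is what produces the repeated dumps in this log until the start ultimately fails with exit status 80.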
	I0731 10:43:43.612736    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:43 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:43:43.612866    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:43 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:43:43.612875    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:43 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:43:43.632555    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:43:43 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:43:43.676609    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 3
	I0731 10:43:43.676672    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:43.676793    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:43:43.678248    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:43:43.678354    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:43:45.679066    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 4
	I0731 10:43:45.679084    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:45.679145    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:43:45.680000    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:43:45.680058    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:43:47.682245    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 5
	I0731 10:43:47.682257    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:47.682357    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:43:47.683212    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:43:47.683266    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:43:49.684628    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 6
	I0731 10:43:49.684644    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:49.684702    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:43:49.685549    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:43:49.685598    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:43:51.686205    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 7
	I0731 10:43:51.686222    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:51.686350    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:43:51.687175    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:43:51.687230    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:43:53.687932    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 8
	I0731 10:43:53.687949    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:53.688031    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:43:53.688845    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:43:53.688878    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:43:55.690132    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 9
	I0731 10:43:55.690145    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:43:55.690229    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:43:55.691083    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:43:55.691145    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:43:55.691156    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:43:55.691163    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:43:55.691171    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:43:55.691180    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:43:55.691186    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:43:55.691193    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:43:55.691200    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:43:55.691205    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:43:55.691212    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:43:55.691228    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:43:55.691238    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:43:55.691245    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:43:55.691255    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:43:55.691262    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:43:55.691271    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:43:55.691283    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:43:55.691291    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:43:55.691300    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:43:55.691308    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	[Attempts 10–15 elided: every ~2s (10:43:57 through 10:44:07) the driver repeats the identical scan of /var/db/dhcpd_leases, finding the same 19 lease entries (192.169.0.2–192.169.0.20, hyperkit pid 5650); the target MAC b2:40:43:6a:92:e never appears.]
	I0731 10:44:09.705706    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 16
	I0731 10:44:09.705722    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:09.705817    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:09.706612    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:09.706682    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:09.706693    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:09.706701    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:09.706708    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:09.706716    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:09.706724    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:09.706731    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:09.706739    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:09.706754    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:09.706763    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:09.706770    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:09.706777    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:09.706785    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:09.706792    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:09.706798    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:09.706806    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:09.706813    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:09.706842    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:09.706860    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:09.706871    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:11.707676    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 17
	I0731 10:44:11.707689    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:11.707825    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:11.708688    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:11.708742    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:11.708752    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:11.708759    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:11.708765    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:11.708777    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:11.708791    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:11.708807    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:11.708818    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:11.708834    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:11.708848    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:11.708861    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:11.708870    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:11.708877    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:11.708885    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:11.708905    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:11.708915    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:11.708923    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:11.708931    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:11.708948    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:11.708960    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:13.710989    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 18
	I0731 10:44:13.711004    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:13.711040    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:13.711888    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:13.711934    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:13.711945    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:13.711952    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:13.711968    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:13.711977    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:13.711992    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:13.712015    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:13.712024    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:13.712058    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:13.712071    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:13.712078    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:13.712085    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:13.712091    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:13.712099    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:13.712106    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:13.712114    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:13.712122    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:13.712136    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:13.712151    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:13.712159    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:15.714141    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 19
	I0731 10:44:15.714155    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:15.714227    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:15.715036    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:15.715095    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:15.715107    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:15.715116    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:15.715121    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:15.715147    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:15.715162    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:15.715174    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:15.715195    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:15.715213    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:15.715221    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:15.715228    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:15.715236    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:15.715243    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:15.715256    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:15.715265    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:15.715272    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:15.715278    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:15.715286    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:15.715293    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:15.715304    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:17.717010    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 20
	I0731 10:44:17.717026    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:17.717127    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:17.717985    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:17.718018    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:17.718028    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:17.718039    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:17.718047    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:17.718077    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:17.718088    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:17.718102    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:17.718115    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:17.718133    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:17.718144    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:17.718151    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:17.718160    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:17.718166    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:17.718174    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:17.718184    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:17.718190    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:17.718198    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:17.718206    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:17.718213    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:17.718226    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:19.718262    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 21
	I0731 10:44:19.718274    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:19.718397    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:19.719221    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:19.719264    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:19.719279    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:19.719288    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:19.719297    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:19.719306    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:19.719311    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:19.719319    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:19.719326    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:19.719340    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:19.719353    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:19.719362    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:19.719368    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:19.719383    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:19.719396    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:19.719407    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:19.719422    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:19.719431    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:19.719442    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:19.719449    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:19.719457    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:21.721459    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 22
	I0731 10:44:21.721472    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:21.721551    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:21.722388    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:21.722435    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:21.722451    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:21.722465    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:21.722475    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:21.722481    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:21.722487    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:21.722508    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:21.722520    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:21.722529    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:21.722537    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:21.722543    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:21.722551    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:21.722570    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:21.722582    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:21.722596    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:21.722604    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:21.722611    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:21.722617    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:21.722623    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:21.722631    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:23.724494    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 23
	I0731 10:44:23.724507    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:23.724607    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:23.725517    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:23.725559    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:23.725569    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:23.725577    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:23.725583    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:23.725601    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:23.725621    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:23.725630    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:23.725638    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:23.725645    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:23.725663    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:23.725673    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:23.725682    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:23.725690    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:23.725703    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:23.725712    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:23.725719    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:23.725738    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:23.725750    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:23.725758    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:23.725766    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:25.726036    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 24
	I0731 10:44:25.726051    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:25.726188    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:25.726994    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:25.727056    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:25.727067    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:25.727077    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:25.727090    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:25.727101    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:25.727109    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:25.727117    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:25.727124    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:25.727135    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:25.727151    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:25.727159    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:25.727165    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:25.727178    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:25.727188    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:25.727196    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:25.727209    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:25.727216    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:25.727221    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:25.727228    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:25.727236    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:27.727833    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 25
	I0731 10:44:27.727847    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:27.727982    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:27.728916    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:27.728970    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:27.728984    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:27.728995    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:27.729007    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:27.729027    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:27.729047    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:27.729060    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:27.729070    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:27.729077    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:27.729091    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:27.729099    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:27.729106    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:27.729112    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:27.729118    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:27.729125    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:27.729131    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:27.729137    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:27.729144    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:27.729150    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:27.729156    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:29.730526    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 26
	I0731 10:44:29.730539    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:29.730550    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:29.731371    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:29.731426    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:29.731441    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:29.731451    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:29.731457    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:29.731465    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:29.731487    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:29.731495    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:29.731503    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:29.731521    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:29.731532    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:29.731541    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:29.731548    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:29.731555    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:29.731563    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:29.731571    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:29.731579    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:29.731591    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:29.731602    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:29.731616    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:29.731629    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:31.732880    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 27
	I0731 10:44:31.732893    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:31.733025    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:31.733833    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:31.733882    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:31.733891    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:31.733899    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:31.733909    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:31.733925    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:31.733939    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:31.733956    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:31.733969    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:31.733977    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:31.733989    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:31.733998    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:31.734006    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:31.734014    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:31.734021    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:31.734029    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:31.734036    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:31.734044    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:31.734052    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:31.734058    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:31.734066    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:33.736134    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 28
	I0731 10:44:33.736148    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:33.736189    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:33.737059    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:33.737085    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:33.737100    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:33.737109    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:33.737117    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:33.737126    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:33.737133    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:33.737140    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:33.737149    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:33.737157    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:33.737164    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:33.737173    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:33.737179    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:33.737185    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:33.737193    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:33.737200    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:33.737206    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:33.737219    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:33.737231    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:33.737244    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:33.737254    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:35.739155    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 29
	I0731 10:44:35.739172    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:35.739265    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:35.740143    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for b2:40:43:6a:92:e in /var/db/dhcpd_leases ...
	I0731 10:44:35.740193    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 19 entries in /var/db/dhcpd_leases!
	I0731 10:44:35.740212    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:35.740228    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:35.740239    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:35.740247    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:35.740255    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:35.740266    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:35.740280    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:35.740296    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:35.740309    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:35.740323    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:35.740331    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:35.740338    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:35.740350    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:35.740358    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:35.740366    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:35.740380    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:35.740409    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:35.740443    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:35.740450    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:37.742455    5604 client.go:171] duration metric: took 1m1.357743543s to LocalClient.Create
	I0731 10:44:39.742650    5604 start.go:128] duration metric: took 1m3.389580756s to createHost
	I0731 10:44:39.742663    5604 start.go:83] releasing machines lock for "offline-docker-296000", held for 1m3.389679677s
	W0731 10:44:39.742680    5604 start.go:714] error starting host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for b2:40:43:6a:92:e
	I0731 10:44:39.742980    5604 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:44:39.743008    5604 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:44:39.751937    5604 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54072
	I0731 10:44:39.752291    5604 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:44:39.752640    5604 main.go:141] libmachine: Using API Version  1
	I0731 10:44:39.752653    5604 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:44:39.752902    5604 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:44:39.753300    5604 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:44:39.753330    5604 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:44:39.761710    5604 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54074
	I0731 10:44:39.762051    5604 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:44:39.762397    5604 main.go:141] libmachine: Using API Version  1
	I0731 10:44:39.762415    5604 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:44:39.762649    5604 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:44:39.762778    5604 main.go:141] libmachine: (offline-docker-296000) Calling .GetState
	I0731 10:44:39.762880    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:39.762943    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:39.763919    5604 main.go:141] libmachine: (offline-docker-296000) Calling .DriverName
	I0731 10:44:39.784949    5604 out.go:177] * Deleting "offline-docker-296000" in hyperkit ...
	I0731 10:44:39.806122    5604 main.go:141] libmachine: (offline-docker-296000) Calling .Remove
	I0731 10:44:39.806247    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:39.806256    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:39.806335    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:39.807294    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:39.807345    5604 main.go:141] libmachine: (offline-docker-296000) DBG | waiting for graceful shutdown
	I0731 10:44:40.808746    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:40.808848    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:40.809800    5604 main.go:141] libmachine: (offline-docker-296000) DBG | waiting for graceful shutdown
	I0731 10:44:41.810876    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:41.810961    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:41.812676    5604 main.go:141] libmachine: (offline-docker-296000) DBG | waiting for graceful shutdown
	I0731 10:44:42.814619    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:42.814711    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:42.815397    5604 main.go:141] libmachine: (offline-docker-296000) DBG | waiting for graceful shutdown
	I0731 10:44:43.816491    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:43.816571    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:43.817139    5604 main.go:141] libmachine: (offline-docker-296000) DBG | waiting for graceful shutdown
	I0731 10:44:44.819238    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:44.819352    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5650
	I0731 10:44:44.820712    5604 main.go:141] libmachine: (offline-docker-296000) DBG | sending sigkill
	I0731 10:44:44.820729    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:44.831953    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:44 WARN : hyperkit: failed to read stderr: EOF
	I0731 10:44:44.831979    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:44 WARN : hyperkit: failed to read stdout: EOF
	W0731 10:44:44.850780    5604 out.go:239] ! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for b2:40:43:6a:92:e
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for b2:40:43:6a:92:e
	I0731 10:44:44.850799    5604 start.go:729] Will try again in 5 seconds ...
	I0731 10:44:49.851459    5604 start.go:360] acquireMachinesLock for offline-docker-296000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:44:57.025899    5604 start.go:364] duration metric: took 7.17428234s to acquireMachinesLock for "offline-docker-296000"
	I0731 10:44:57.025973    5604 start.go:93] Provisioning new machine with config: &{Name:offline-docker-296000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:offline-docker-296000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:44:57.026042    5604 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 10:44:57.072500    5604 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0731 10:44:57.072606    5604 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:44:57.072636    5604 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:44:57.081885    5604 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54100
	I0731 10:44:57.082260    5604 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:44:57.082613    5604 main.go:141] libmachine: Using API Version  1
	I0731 10:44:57.082629    5604 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:44:57.082832    5604 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:44:57.082967    5604 main.go:141] libmachine: (offline-docker-296000) Calling .GetMachineName
	I0731 10:44:57.083066    5604 main.go:141] libmachine: (offline-docker-296000) Calling .DriverName
	I0731 10:44:57.083179    5604 start.go:159] libmachine.API.Create for "offline-docker-296000" (driver="hyperkit")
	I0731 10:44:57.083206    5604 client.go:168] LocalClient.Create starting
	I0731 10:44:57.083236    5604 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 10:44:57.083274    5604 main.go:141] libmachine: Decoding PEM data...
	I0731 10:44:57.083283    5604 main.go:141] libmachine: Parsing certificate...
	I0731 10:44:57.083331    5604 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 10:44:57.083359    5604 main.go:141] libmachine: Decoding PEM data...
	I0731 10:44:57.083370    5604 main.go:141] libmachine: Parsing certificate...
	I0731 10:44:57.083383    5604 main.go:141] libmachine: Running pre-create checks...
	I0731 10:44:57.083389    5604 main.go:141] libmachine: (offline-docker-296000) Calling .PreCreateCheck
	I0731 10:44:57.083466    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:57.083484    5604 main.go:141] libmachine: (offline-docker-296000) Calling .GetConfigRaw
	I0731 10:44:57.084131    5604 main.go:141] libmachine: Creating machine...
	I0731 10:44:57.084139    5604 main.go:141] libmachine: (offline-docker-296000) Calling .Create
	I0731 10:44:57.084209    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:57.084339    5604 main.go:141] libmachine: (offline-docker-296000) DBG | I0731 10:44:57.084206    5743 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:44:57.084388    5604 main.go:141] libmachine: (offline-docker-296000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 10:44:57.349336    5604 main.go:141] libmachine: (offline-docker-296000) DBG | I0731 10:44:57.349282    5743 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/id_rsa...
	I0731 10:44:57.415984    5604 main.go:141] libmachine: (offline-docker-296000) DBG | I0731 10:44:57.415938    5743 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/offline-docker-296000.rawdisk...
	I0731 10:44:57.415997    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Writing magic tar header
	I0731 10:44:57.416017    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Writing SSH key tar header
	I0731 10:44:57.416312    5604 main.go:141] libmachine: (offline-docker-296000) DBG | I0731 10:44:57.416281    5743 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000 ...
	I0731 10:44:57.792433    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:57.792455    5604 main.go:141] libmachine: (offline-docker-296000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/hyperkit.pid
	I0731 10:44:57.792512    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Using UUID d141a6f4-6cbc-434b-bb7f-632a98dd7f50
	I0731 10:44:57.817664    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Generated MAC ca:7b:1f:e2:9e:d7
	I0731 10:44:57.817683    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-296000
	I0731 10:44:57.817719    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"d141a6f4-6cbc-434b-bb7f-632a98dd7f50", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001e2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:44:57.817748    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"d141a6f4-6cbc-434b-bb7f-632a98dd7f50", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001e2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:44:57.817811    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "d141a6f4-6cbc-434b-bb7f-632a98dd7f50", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/offline-docker-296000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-296000"}
	I0731 10:44:57.817851    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U d141a6f4-6cbc-434b-bb7f-632a98dd7f50 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/offline-docker-296000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=offline-docker-296000"
	I0731 10:44:57.817861    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:44:57.820846    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 DEBUG: hyperkit: Pid is 5744
	I0731 10:44:57.821313    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 0
	I0731 10:44:57.821338    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:57.821404    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:44:57.822451    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:44:57.822519    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:44:57.822536    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:44:57.822566    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:57.822594    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:57.822613    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:57.822636    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:57.822648    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:57.822660    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:57.822673    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:57.822684    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:57.822704    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:57.822718    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:57.822732    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:57.822743    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:57.822762    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:57.822783    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:57.822793    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:57.822800    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:57.822807    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:57.822823    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:57.822839    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:44:57.828574    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:44:57.836925    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/offline-docker-296000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:44:57.838027    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:44:57.838050    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:44:57.838063    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:44:57.838075    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:44:58.212960    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:58 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:44:58.213014    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:58 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:44:58.328805    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:44:58.328845    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:44:58.328885    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:44:58.328905    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:44:58.329664    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:58 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:44:58.329673    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:44:58 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:44:59.823126    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 1
	I0731 10:44:59.823152    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:44:59.823214    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:44:59.824012    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:44:59.824080    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:44:59.824095    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:44:59.824113    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:44:59.824126    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:44:59.824147    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:44:59.824153    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:44:59.824167    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:44:59.824178    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:44:59.824186    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:44:59.824193    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:44:59.824201    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:44:59.824207    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:44:59.824213    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:44:59.824229    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:44:59.824237    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:44:59.824244    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:44:59.824251    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:44:59.824258    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:44:59.824265    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:44:59.824273    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:44:59.824282    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:01.825391    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 2
	I0731 10:45:01.825410    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:01.825500    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:01.826345    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:01.826391    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:01.826402    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:01.826412    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:01.826422    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:01.826428    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:01.826435    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:01.826442    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:01.826447    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:01.826454    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:01.826460    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:01.826467    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:01.826475    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:01.826481    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:01.826487    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:01.826495    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:01.826502    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:01.826507    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:01.826524    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:01.826539    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:01.826557    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:01.826570    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:03.800808    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:45:03 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 10:45:03.800951    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:45:03 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 10:45:03.800961    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:45:03 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 10:45:03.820685    5604 main.go:141] libmachine: (offline-docker-296000) DBG | 2024/07/31 10:45:03 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 10:45:03.827821    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 3
	I0731 10:45:03.827831    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:03.827932    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:03.828766    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:03.828828    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:03.828842    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:03.828861    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:03.828871    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:03.828878    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:03.828885    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:03.828901    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:03.828908    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:03.828915    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:03.828924    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:03.828935    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:03.828946    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:03.828955    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:03.828963    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:03.828971    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:03.828979    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:03.828986    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:03.828993    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:03.829001    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:03.829008    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:03.829016    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:05.830014    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 4
	I0731 10:45:05.830030    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:05.830117    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:05.830947    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:05.830999    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:05.831011    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:05.831033    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:05.831044    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:05.831056    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:05.831063    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:05.831071    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:05.831079    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:05.831087    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:05.831094    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:05.831102    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:05.831119    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:05.831131    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:05.831140    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:05.831148    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:05.831162    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:05.831171    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:05.831180    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:05.831188    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:05.831195    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:05.831205    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:07.832062    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 5
	I0731 10:45:07.832086    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:07.832168    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:07.833028    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:07.833092    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:07.833103    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:07.833121    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:07.833128    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:07.833135    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:07.833142    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:07.833148    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:07.833154    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:07.833161    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:07.833174    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:07.833181    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:07.833188    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:07.833208    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:07.833214    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:07.833221    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:07.833229    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:07.833241    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:07.833253    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:07.833262    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:07.833270    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:07.833279    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:09.834457    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 6
	I0731 10:45:09.834474    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:09.834582    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:09.835420    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:09.835467    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:09.835480    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:09.835489    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:09.835498    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:09.835520    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:09.835540    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:09.835548    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:09.835556    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:09.835564    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:09.835578    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:09.835589    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:09.835611    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:09.835624    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:09.835633    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:09.835641    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:09.835647    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:09.835653    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:09.835659    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:09.835673    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:09.835685    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:09.835695    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:11.837652    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 7
	I0731 10:45:11.837668    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:11.837728    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:11.838559    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:11.838602    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:11.838612    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:11.838621    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:11.838628    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:11.838639    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:11.838645    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:11.838653    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:11.838675    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:11.838688    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:11.838695    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:11.838703    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:11.838711    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:11.838718    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:11.838730    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:11.838739    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:11.838746    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:11.838757    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:11.838765    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:11.838772    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:11.838779    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:11.838796    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:13.839141    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 8
	I0731 10:45:13.839157    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:13.839230    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:13.840062    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:13.840104    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:13.840114    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:13.840123    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:13.840132    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:13.840140    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:13.840147    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:13.840154    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:13.840160    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:13.840166    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:13.840171    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:13.840179    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:13.840187    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:13.840200    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:13.840208    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:13.840221    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:13.840229    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:13.840236    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:13.840243    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:13.840267    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:13.840281    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:13.840299    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:15.841350    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 9
	I0731 10:45:15.841365    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:15.841427    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:15.842315    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:15.842354    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:15.842362    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:15.842370    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:15.842376    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:15.842383    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:15.842389    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:15.842402    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:15.842416    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:15.842432    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:15.842445    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:15.842463    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:15.842472    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:15.842479    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:15.842488    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:15.842495    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:15.842506    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:15.842513    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:15.842522    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:15.842529    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:15.842536    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:15.842544    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:17.844563    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 10
	I0731 10:45:17.844576    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:17.844630    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:17.845695    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:17.845740    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:17.845752    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:17.845771    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:17.845778    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:17.845784    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:17.845792    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:17.845798    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:17.845806    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:17.845813    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:17.845822    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:17.845828    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:17.845835    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:17.845841    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:17.845847    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:17.845868    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:17.845879    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:17.845887    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:17.845896    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:17.845904    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:17.845911    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:17.845920    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:19.847598    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 11
	I0731 10:45:19.847617    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:19.847734    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:19.848591    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:19.848651    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:19.848667    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:19.848677    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:19.848686    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:19.848694    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:19.848718    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:19.848729    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:19.848764    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:19.848778    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:19.848794    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:19.848804    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:19.848816    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:19.848826    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:19.848834    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:19.848842    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:19.848850    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:19.848858    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:19.848866    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:19.848879    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:19.848895    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:19.848908    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:21.849683    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 12
	I0731 10:45:21.849701    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:21.849768    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:21.850660    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:21.850754    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:21.850765    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:21.850775    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:21.850781    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:21.850788    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:21.850801    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:21.850809    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:21.850823    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:21.850831    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:21.850840    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:21.850858    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:21.850866    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:21.850875    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:21.850883    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:21.850889    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:21.850902    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:21.850916    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:21.850932    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:21.850946    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:21.850954    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:21.850963    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:23.851835    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 13
	I0731 10:45:23.851851    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:23.851926    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:23.852731    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:23.852806    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:23.852820    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:23.852838    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:23.852850    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:23.852860    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:23.852871    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:23.852894    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:23.852904    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:23.852912    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:23.852919    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:23.852927    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:23.852933    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:23.852953    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:23.852965    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:23.852974    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:23.852982    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:23.852989    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:23.852999    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:23.853024    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:23.853033    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:23.853042    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:25.853879    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 14
	I0731 10:45:25.853906    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:25.853970    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:25.854895    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:25.854955    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:25.854965    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:25.854983    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:25.855002    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:25.855019    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:25.855027    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:25.855042    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:25.855053    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:25.855061    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:25.855069    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:25.855083    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:25.855094    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:25.855114    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:25.855127    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:25.855135    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:25.855144    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:25.855151    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:25.855159    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:25.855165    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:25.855171    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:25.855177    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:27.856173    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 15
	I0731 10:45:27.856189    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:27.856301    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:27.857113    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:27.857187    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:27.857200    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:27.857207    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:27.857214    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:27.857233    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:27.857240    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:27.857247    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:27.857255    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:27.857277    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:27.857290    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:27.857298    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:27.857306    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:27.857313    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:27.857321    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:27.857328    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:27.857334    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:27.857340    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:27.857346    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:27.857356    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:27.857365    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:27.857373    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:29.858368    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 16
	I0731 10:45:29.858385    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:29.858471    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:29.859284    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:29.859337    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:29.859347    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:29.859363    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:29.859389    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:29.859396    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:29.859406    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:29.859417    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:29.859427    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:29.859433    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:29.859442    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:29.859448    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:29.859454    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:29.859461    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:29.859469    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:29.859477    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:29.859485    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:29.859492    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:29.859500    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:29.859511    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:29.859519    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:29.859535    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:31.860118    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 17
	I0731 10:45:31.860132    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:31.860204    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:31.861008    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:31.861065    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:31.861078    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:31.861095    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:31.861102    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:31.861114    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:31.861122    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:31.861138    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:31.861146    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:31.861152    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:31.861158    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:31.861169    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:31.861181    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:31.861191    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:31.861205    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:31.861216    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:31.861225    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:31.861232    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:31.861239    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:31.861245    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:31.861250    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:31.861256    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:33.863293    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 18
	I0731 10:45:33.863319    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:33.863436    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:33.864269    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:33.864325    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:33.864338    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc991}
	I0731 10:45:33.864347    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:33.864354    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:33.864367    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:33.864379    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:33.864389    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:33.864396    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:33.864404    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:33.864411    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:33.864418    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:33.864426    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:33.864433    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:33.864441    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:33.864457    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:33.864470    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:33.864479    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:33.864489    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:33.864498    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:33.864509    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:33.864518    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:35.866010    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 19
	I0731 10:45:35.866036    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:35.866088    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:35.866907    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:35.866968    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:35.866982    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:35.866990    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:35.866995    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:35.867018    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:35.867030    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:35.867058    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:35.867072    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:35.867080    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:35.867090    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:35.867101    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:35.867114    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:35.867122    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:35.867131    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:35.867146    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:35.867159    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:35.867171    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:35.867177    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:35.867202    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:35.867214    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:35.867230    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:37.868071    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 20
	I0731 10:45:37.868083    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:37.868149    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:37.869025    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:37.869081    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:37.869103    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:37.869136    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:37.869142    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:37.869148    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:37.869154    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:37.869161    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:37.869169    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:37.869175    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:37.869182    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:37.869190    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:37.869195    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:37.869213    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:37.869227    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:37.869234    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:37.869240    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:37.869247    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:37.869255    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:37.869262    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:37.869268    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:37.869275    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:39.870455    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 21
	I0731 10:45:39.870472    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:39.870522    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:39.871430    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:39.871487    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:39.871495    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:39.871507    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:39.871518    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:39.871529    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:39.871545    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:39.871557    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:39.871564    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:39.871575    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:39.871583    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:39.871598    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:39.871611    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:39.871619    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:39.871627    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:39.871635    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:39.871642    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:39.871659    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:39.871672    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:39.871688    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:39.871704    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:39.871714    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:41.873143    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 22
	I0731 10:45:41.873159    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:41.873224    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:41.874048    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:41.874125    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:41.874135    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:41.874143    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:41.874149    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:41.874156    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:41.874164    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:41.874171    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:41.874179    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:41.874188    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:41.874198    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:41.874206    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:41.874213    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:41.874221    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:41.874242    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:41.874257    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:41.874268    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:41.874284    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:41.874293    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:41.874301    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:41.874310    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:41.874318    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:43.874836    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 23
	I0731 10:45:43.874850    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:43.874932    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:43.875765    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:43.875817    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:43.875830    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:43.875848    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:43.875858    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:43.875888    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:43.875905    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:43.875915    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:43.875923    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:43.875931    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:43.875940    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:43.875948    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:43.875955    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:43.875969    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:43.875980    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:43.875988    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:43.875996    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:43.876009    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:43.876023    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:43.876037    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:43.876049    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:43.876070    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:45.876901    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 24
	I0731 10:45:45.876915    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:45.877015    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:45.877852    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:45.877908    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:45.877917    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:45.877925    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:45.877931    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:45.877938    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:45.877944    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:45.877963    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:45.877974    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:45.877987    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:45.877996    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:45.878018    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:45.878028    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:45.878035    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:45.878043    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:45.878049    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:45.878057    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:45.878064    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:45.878069    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:45.878082    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:45.878096    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:45.878106    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:47.878803    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 25
	I0731 10:45:47.878818    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:47.878897    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:47.879736    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:47.879781    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:47.879793    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:47.879803    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:47.879815    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:47.879827    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:47.879840    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:47.879848    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:47.879856    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:47.879871    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:47.879882    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:47.879890    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:47.879898    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:47.879910    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:47.879918    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:47.879926    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:47.879935    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:47.879942    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:47.879950    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:47.879961    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:47.879969    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:47.879977    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:49.881217    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 26
	I0731 10:45:49.881229    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:49.881331    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:49.882179    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:49.882205    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:49.882225    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:49.882244    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:49.882255    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:49.882270    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:49.882279    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:49.882286    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:49.882294    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:49.882301    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:49.882308    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:49.882315    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:49.882323    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:49.882330    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:49.882338    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:49.882344    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:49.882350    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:49.882357    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:49.882364    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:49.882373    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:49.882380    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:49.882389    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:51.884394    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 27
	I0731 10:45:51.884409    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:51.884446    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:51.885466    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:51.885511    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:51.885521    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:51.885530    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:51.885538    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:51.885551    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:51.885574    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:51.885582    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:51.885589    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:51.885596    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:51.885606    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:51.885614    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:51.885620    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:51.885626    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:51.885632    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:51.885639    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:51.885652    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:51.885660    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:51.885667    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:51.885674    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:51.885697    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:51.885709    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:53.886778    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 28
	I0731 10:45:53.886793    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:53.886912    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:53.887725    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:53.887842    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:53.887853    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:53.887862    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:53.887869    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:53.887875    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:53.887882    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:53.887890    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:53.887899    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:53.887906    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:53.887912    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:53.887926    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:53.887936    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:53.887943    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:53.887952    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:53.887959    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:53.887966    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:53.887974    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:53.887982    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:53.887989    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:53.888000    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:53.888009    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:55.888426    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Attempt 29
	I0731 10:45:55.888440    5604 main.go:141] libmachine: (offline-docker-296000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:45:55.888507    5604 main.go:141] libmachine: (offline-docker-296000) DBG | hyperkit pid from json: 5744
	I0731 10:45:55.889398    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Searching for ca:7b:1f:e2:9e:d7 in /var/db/dhcpd_leases ...
	I0731 10:45:55.889445    5604 main.go:141] libmachine: (offline-docker-296000) DBG | Found 20 entries in /var/db/dhcpd_leases!
	I0731 10:45:55.889455    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66aa783e}
	I0731 10:45:55.889470    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:45:55.889476    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:45:55.889483    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:45:55.889490    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:45:55.889498    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:45:55.889506    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:45:55.889513    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:45:55.889520    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:45:55.889530    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:45:55.889538    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:45:55.889544    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:45:55.889551    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:45:55.889569    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:45:55.889582    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:45:55.889591    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:45:55.889598    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:45:55.889605    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:45:55.889610    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:45:55.889619    5604 main.go:141] libmachine: (offline-docker-296000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:45:57.891704    5604 client.go:171] duration metric: took 1m0.807601793s to LocalClient.Create
	I0731 10:45:59.893855    5604 start.go:128] duration metric: took 1m2.866881645s to createHost
	I0731 10:45:59.893945    5604 start.go:83] releasing machines lock for "offline-docker-296000", held for 1m2.867096722s
	W0731 10:45:59.894011    5604 out.go:239] * Failed to start hyperkit VM. Running "minikube delete -p offline-docker-296000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ca:7b:1f:e2:9e:d7
	* Failed to start hyperkit VM. Running "minikube delete -p offline-docker-296000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ca:7b:1f:e2:9e:d7
	I0731 10:45:59.916418    5604 out.go:177] 
	W0731 10:45:59.957278    5604 out.go:239] X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ca:7b:1f:e2:9e:d7
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ca:7b:1f:e2:9e:d7
	W0731 10:45:59.957337    5604 out.go:239] * 
	* 
	W0731 10:45:59.957987    5604 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:46:00.021242    5604 out.go:177] 

                                                
                                                
** /stderr **
aab_offline_test.go:58: out/minikube-darwin-amd64 start -p offline-docker-296000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit  failed: exit status 80
panic.go:626: *** TestOffline FAILED at 2024-07-31 10:46:00.169316 -0700 PDT m=+4005.152858880
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p offline-docker-296000 -n offline-docker-296000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p offline-docker-296000 -n offline-docker-296000: exit status 7 (81.700804ms)

                                                
                                                
-- stdout --
	Error

                                                
                                                
-- /stdout --
** stderr ** 
	E0731 10:46:00.248919    5803 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0731 10:46:00.248943    5803 status.go:249] status error: getting IP: IP address is not set

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "offline-docker-296000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:175: Cleaning up "offline-docker-296000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p offline-docker-296000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p offline-docker-296000: (5.235823638s)
--- FAIL: TestOffline (149.72s)
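The failure above reduces to the hyperkit driver repeatedly polling /var/db/dhcpd_leases (the macOS bootpd lease database) for the guest's MAC address and never finding a matching entry. A minimal sketch of that lookup, run against an inline sample lease entry (values taken from the log above) rather than the live file:

```shell
# Sketch: check whether a guest MAC appears in a dhcpd_leases-style entry.
# A real lookup would read /var/db/dhcpd_leases; this uses an inline sample.
leases='{
	name=minikube
	ip_address=192.169.0.5
	hw_address=1,9e:7:8b:23:9c:e3
	lease=0x66abc048
}'
mac="9e:7:8b:23:9c:e3"
if printf '%s\n' "$leases" | grep -q "hw_address=1,$mac"; then
	echo "lease found for $mac"
else
	echo "no lease for $mac"   # the driver retries, then fails as above
fi
# prints: lease found for 9e:7:8b:23:9c:e3
```

In the failing run, the MAC the driver searched for (ca:7b:1f:e2:9e:d7) never showed up among the 20 entries it found, so the create timed out with GUEST_PROVISION.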

                                                
                                    
TestCertOptions (761.12s)

                                                
                                                
=== RUN   TestCertOptions
=== PAUSE TestCertOptions

                                                
                                                

                                                
                                                
=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-options-029000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit 
cert_options_test.go:49: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p cert-options-029000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit : exit status 80 (12m35.445093s)

                                                
                                                
-- stdout --
	* [cert-options-029000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "cert-options-029000" primary control-plane node in "cert-options-029000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "cert-options-029000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for f2:fa:4b:a2:ec:9b
	* Failed to start hyperkit VM. Running "minikube delete -p cert-options-029000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ea:54:73:d3:46:f4
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ea:54:73:d3:46:f4
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
cert_options_test.go:51: failed to start minikube with args: "out/minikube-darwin-amd64 start -p cert-options-029000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit " : exit status 80
cert_options_test.go:60: (dbg) Run:  out/minikube-darwin-amd64 -p cert-options-029000 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:60: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p cert-options-029000 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt": exit status 50 (163.840515ms)

                                                
                                                
-- stdout --
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node cert-options-029000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

                                                
                                                
** /stderr **
cert_options_test.go:62: failed to read apiserver cert inside minikube. args "out/minikube-darwin-amd64 -p cert-options-029000 ssh \"openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt\"": exit status 50
cert_options_test.go:69: apiserver cert does not include 127.0.0.1 in SAN.
cert_options_test.go:69: apiserver cert does not include 192.168.15.15 in SAN.
cert_options_test.go:69: apiserver cert does not include localhost in SAN.
cert_options_test.go:69: apiserver cert does not include www.google.com in SAN.
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-029000 config view
cert_options_test.go:93: Kubeconfig apiserver server port incorrect. Output of 
'kubectl config view' = "\n-- stdout --\n\tapiVersion: v1\n\tclusters: null\n\tcontexts: null\n\tcurrent-context: \"\"\n\tkind: Config\n\tpreferences: {}\n\tusers: null\n\n-- /stdout --"
cert_options_test.go:100: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cert-options-029000 -- "sudo cat /etc/kubernetes/admin.conf"
cert_options_test.go:100: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p cert-options-029000 -- "sudo cat /etc/kubernetes/admin.conf": exit status 50 (161.244275ms)

                                                
                                                
-- stdout --
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node cert-options-029000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

                                                
                                                
** /stderr **
cert_options_test.go:102: failed to SSH to minikube with args: "out/minikube-darwin-amd64 ssh -p cert-options-029000 -- \"sudo cat /etc/kubernetes/admin.conf\"" : exit status 50
cert_options_test.go:106: Internal minikube kubeconfig (admin.conf) does not contains the right api port. 
-- stdout --
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node cert-options-029000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

                                                
                                                
** /stderr **
cert_options_test.go:109: *** TestCertOptions FAILED at 2024-07-31 11:12:32.330097 -0700 PDT m=+5597.447575931
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p cert-options-029000 -n cert-options-029000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p cert-options-029000 -n cert-options-029000: exit status 7 (76.513693ms)

                                                
                                                
-- stdout --
	Error

                                                
                                                
-- /stdout --
** stderr ** 
	E0731 11:12:32.404552    7127 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0731 11:12:32.404582    7127 status.go:249] status error: getting IP: IP address is not set

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "cert-options-029000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:175: Cleaning up "cert-options-029000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-options-029000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-options-029000: (5.231552932s)
--- FAIL: TestCertOptions (761.12s)
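The SAN assertions checked above use the same `openssl x509 -text` invocation the test runs over SSH; they all fail here only because the VM never came up. The check itself can be reproduced locally against a throwaway certificate (a sketch; requires OpenSSL 1.1.1+ for `-addext`, and the SAN list mirrors the test's expected values):

```shell
# Generate a throwaway cert carrying the SANs TestCertOptions expects,
# then inspect them the same way the test does via `openssl x509 -text`.
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout "$tmp/key.pem" -out "$tmp/cert.pem" \
  -subj "/CN=minikube" \
  -addext "subjectAltName=IP:127.0.0.1,IP:192.168.15.15,DNS:localhost,DNS:www.google.com"
text=$(openssl x509 -text -noout -in "$tmp/cert.pem")
for san in 127.0.0.1 192.168.15.15 localhost www.google.com; do
  case "$text" in
    *"$san"*) echo "SAN present: $san" ;;
    *)        echo "SAN missing: $san" ;;
  esac
done
rm -rf "$tmp"
```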

                                                
                                    
TestCertExpiration (1786.77s)

                                                
                                                
=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

                                                
                                                

                                                
                                                
=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-614000 --memory=2048 --cert-expiration=3m --driver=hyperkit 
E0731 10:53:47.095090    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:54:50.237726    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
cert_options_test.go:123: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p cert-expiration-614000 --memory=2048 --cert-expiration=3m --driver=hyperkit : exit status 80 (4m5.739120678s)

                                                
                                                
-- stdout --
	* [cert-expiration-614000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "cert-expiration-614000" primary control-plane node in "cert-expiration-614000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "cert-expiration-614000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 1a:44:a8:3c:19:47
	* Failed to start hyperkit VM. Running "minikube delete -p cert-expiration-614000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 6a:9c:51:81:c6:76
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 6a:9c:51:81:c6:76
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
cert_options_test.go:125: failed to start minikube with args: "out/minikube-darwin-amd64 start -p cert-expiration-614000 --memory=2048 --cert-expiration=3m --driver=hyperkit " : exit status 80
E0731 10:58:19.398903    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:58:27.167831    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
cert_options_test.go:131: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-614000 --memory=2048 --cert-expiration=8760h --driver=hyperkit 
E0731 11:01:54.461709    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 11:03:17.517088    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 11:03:19.393532    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 11:03:27.163282    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 11:04:42.442641    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 11:06:54.457289    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 11:08:19.387158    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 11:08:27.155603    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 11:11:30.218357    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 11:11:54.449749    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
cert_options_test.go:131: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p cert-expiration-614000 --memory=2048 --cert-expiration=8760h --driver=hyperkit : exit status 80 (22m35.688293555s)

                                                
                                                
-- stdout --
	* [cert-expiration-614000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting "cert-expiration-614000" primary control-plane node in "cert-expiration-614000" cluster
	* Updating the running hyperkit "cert-expiration-614000" VM ...
	* Updating the running hyperkit "cert-expiration-614000" VM ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	* Failed to start hyperkit VM. Running "minikube delete -p cert-expiration-614000" may fix it: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
cert_options_test.go:133: failed to start minikube after cert expiration: "out/minikube-darwin-amd64 start -p cert-expiration-614000 --memory=2048 --cert-expiration=8760h --driver=hyperkit " : exit status 80
cert_options_test.go:136: minikube start output did not warn about expired certs: 
-- stdout --
	* [cert-expiration-614000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting "cert-expiration-614000" primary control-plane node in "cert-expiration-614000" cluster
	* Updating the running hyperkit "cert-expiration-614000" VM ...
	* Updating the running hyperkit "cert-expiration-614000" VM ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	* Failed to start hyperkit VM. Running "minikube delete -p cert-expiration-614000" may fix it: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: provision: Temporary Error: error getting ip during provisioning: IP address is not set
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
cert_options_test.go:138: *** TestCertExpiration FAILED at 2024-07-31 11:23:10.540084 -0700 PDT m=+6235.645203614
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p cert-expiration-614000 -n cert-expiration-614000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p cert-expiration-614000 -n cert-expiration-614000: exit status 7 (83.584942ms)

                                                
                                                
-- stdout --
	Error

                                                
                                                
-- /stdout --
** stderr ** 
	E0731 11:23:10.621633    7576 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0731 11:23:10.621656    7576 status.go:249] status error: getting IP: IP address is not set

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "cert-expiration-614000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:175: Cleaning up "cert-expiration-614000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-expiration-614000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-expiration-614000: (5.256227872s)
--- FAIL: TestCertExpiration (1786.77s)
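TestCertExpiration starts the cluster with `--cert-expiration=3m` and expects the second `minikube start` to warn about expired certificates; the run above never reached that assertion because provisioning failed first. The underlying expiry check can be sketched with `openssl x509 -checkend` (file names are illustrative; `-checkend N` exits non-zero when the cert expires within N seconds):

```shell
# Sketch: detect a certificate that will expire within a window, the
# condition the --cert-expiration=3m test is built around.
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout "$tmp/key.pem" -out "$tmp/client.pem" -subj "/CN=minikube-user"
if openssl x509 -checkend 180 -in "$tmp/client.pem" >/dev/null; then
  ok=yes; echo "cert valid beyond the next 180s"
else
  ok=no;  echo "cert expires within 180s; regeneration needed"
fi
rm -rf "$tmp"
# prints: cert valid beyond the next 180s  (the cert is valid for 1 day)
```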

                                                
                                    
TestDockerFlags (199.25s)

                                                
                                                
=== RUN   TestDockerFlags
=== PAUSE TestDockerFlags

=== CONT  TestDockerFlags
docker_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 start -p docker-flags-723000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit 
E0731 10:56:54.469169    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
docker_test.go:51: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p docker-flags-723000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit : exit status 80 (3m13.524757201s)

-- stdout --
	* [docker-flags-723000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "docker-flags-723000" primary control-plane node in "docker-flags-723000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "docker-flags-723000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	I0731 10:56:37.338656    6308 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:56:37.338842    6308 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:56:37.338848    6308 out.go:304] Setting ErrFile to fd 2...
	I0731 10:56:37.338851    6308 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:56:37.339036    6308 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:56:37.340578    6308 out.go:298] Setting JSON to false
	I0731 10:56:37.363229    6308 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":5167,"bootTime":1722443430,"procs":468,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:56:37.363338    6308 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:56:37.384007    6308 out.go:177] * [docker-flags-723000] minikube v1.33.1 on Darwin 14.5
	I0731 10:56:37.427768    6308 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:56:37.427810    6308 notify.go:220] Checking for updates...
	I0731 10:56:37.470436    6308 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:56:37.491645    6308 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:56:37.512645    6308 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:56:37.533432    6308 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:56:37.554621    6308 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:56:37.576078    6308 config.go:182] Loaded profile config "cert-expiration-614000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:56:37.576172    6308 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:56:37.605443    6308 out.go:177] * Using the hyperkit driver based on user configuration
	I0731 10:56:37.647616    6308 start.go:297] selected driver: hyperkit
	I0731 10:56:37.647633    6308 start.go:901] validating driver "hyperkit" against <nil>
	I0731 10:56:37.647643    6308 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:56:37.650661    6308 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:56:37.650772    6308 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:56:37.659079    6308 install.go:137] /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:56:37.662981    6308 install.go:79] stdout: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:56:37.663014    6308 install.go:81] /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit looks good
	I0731 10:56:37.663048    6308 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 10:56:37.663247    6308 start_flags.go:942] Waiting for no components: map[apiserver:false apps_running:false default_sa:false extra:false kubelet:false node_ready:false system_pods:false]
	I0731 10:56:37.663304    6308 cni.go:84] Creating CNI manager for ""
	I0731 10:56:37.663322    6308 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0731 10:56:37.663329    6308 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0731 10:56:37.663399    6308 start.go:340] cluster config:
	{Name:docker-flags-723000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[FOO=BAR BAZ=BAT] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[debug icc=true] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:docker-flags-723000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:false EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:false apps_running:false default_sa:false extra:false kubelet:false node_ready:false system_pods:false] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:56:37.663482    6308 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:56:37.705611    6308 out.go:177] * Starting "docker-flags-723000" primary control-plane node in "docker-flags-723000" cluster
	I0731 10:56:37.726607    6308 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:56:37.726642    6308 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:56:37.726655    6308 cache.go:56] Caching tarball of preloaded images
	I0731 10:56:37.726764    6308 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:56:37.726774    6308 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:56:37.726855    6308 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/docker-flags-723000/config.json ...
	I0731 10:56:37.726871    6308 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/docker-flags-723000/config.json: {Name:mk8c3b36da666b31d60fa211ccbe9421a83569dd Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:56:37.727196    6308 start.go:360] acquireMachinesLock for docker-flags-723000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:57:34.663895    6308 start.go:364] duration metric: took 56.937868742s to acquireMachinesLock for "docker-flags-723000"
	I0731 10:57:34.663934    6308 start.go:93] Provisioning new machine with config: &{Name:docker-flags-723000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[FOO=BAR BAZ=BAT] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[debug icc=true] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:docker-flags-723000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:false EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:false apps_running:false default_sa:false extra:false kubelet:false node_ready:false system_pods:false] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:57:34.664014    6308 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 10:57:34.685612    6308 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0731 10:57:34.685784    6308 main.go:141] libmachine: Found binary path at /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:57:34.685850    6308 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:57:34.694524    6308 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54745
	I0731 10:57:34.695012    6308 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:57:34.695606    6308 main.go:141] libmachine: Using API Version  1
	I0731 10:57:34.695615    6308 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:57:34.695867    6308 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:57:34.695981    6308 main.go:141] libmachine: (docker-flags-723000) Calling .GetMachineName
	I0731 10:57:34.696069    6308 main.go:141] libmachine: (docker-flags-723000) Calling .DriverName
	I0731 10:57:34.696227    6308 start.go:159] libmachine.API.Create for "docker-flags-723000" (driver="hyperkit")
	I0731 10:57:34.696293    6308 client.go:168] LocalClient.Create starting
	I0731 10:57:34.696347    6308 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 10:57:34.696446    6308 main.go:141] libmachine: Decoding PEM data...
	I0731 10:57:34.696463    6308 main.go:141] libmachine: Parsing certificate...
	I0731 10:57:34.696540    6308 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 10:57:34.696612    6308 main.go:141] libmachine: Decoding PEM data...
	I0731 10:57:34.696624    6308 main.go:141] libmachine: Parsing certificate...
	I0731 10:57:34.696636    6308 main.go:141] libmachine: Running pre-create checks...
	I0731 10:57:34.696647    6308 main.go:141] libmachine: (docker-flags-723000) Calling .PreCreateCheck
	I0731 10:57:34.696735    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:34.696884    6308 main.go:141] libmachine: (docker-flags-723000) Calling .GetConfigRaw
	I0731 10:57:34.727448    6308 main.go:141] libmachine: Creating machine...
	I0731 10:57:34.727479    6308 main.go:141] libmachine: (docker-flags-723000) Calling .Create
	I0731 10:57:34.727580    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:34.727710    6308 main.go:141] libmachine: (docker-flags-723000) DBG | I0731 10:57:34.727564    6328 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:57:34.727810    6308 main.go:141] libmachine: (docker-flags-723000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 10:57:35.036267    6308 main.go:141] libmachine: (docker-flags-723000) DBG | I0731 10:57:35.036177    6328 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/id_rsa...
	I0731 10:57:35.213920    6308 main.go:141] libmachine: (docker-flags-723000) DBG | I0731 10:57:35.213815    6328 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/docker-flags-723000.rawdisk...
	I0731 10:57:35.213947    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Writing magic tar header
	I0731 10:57:35.213971    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Writing SSH key tar header
	I0731 10:57:35.214538    6308 main.go:141] libmachine: (docker-flags-723000) DBG | I0731 10:57:35.214479    6328 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000 ...
	I0731 10:57:35.590633    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:35.590647    6308 main.go:141] libmachine: (docker-flags-723000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/hyperkit.pid
	I0731 10:57:35.590702    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Using UUID 9eaa99f1-510d-4298-8673-c0c5a875fd9f
	I0731 10:57:35.615892    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Generated MAC ea:e9:ee:61:aa:d
	I0731 10:57:35.615908    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-723000
	I0731 10:57:35.615953    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"9eaa99f1-510d-4298-8673-c0c5a875fd9f", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:57:35.615993    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"9eaa99f1-510d-4298-8673-c0c5a875fd9f", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:57:35.616090    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "9eaa99f1-510d-4298-8673-c0c5a875fd9f", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/docker-flags-723000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-723000"}
	I0731 10:57:35.616144    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 9eaa99f1-510d-4298-8673-c0c5a875fd9f -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/docker-flags-723000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-723000"
	I0731 10:57:35.616181    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:57:35.619147    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 DEBUG: hyperkit: Pid is 6329
	I0731 10:57:35.619658    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 0
	I0731 10:57:35.619675    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:35.619780    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:35.620942    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:35.620983    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:35.621023    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:35.621043    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:35.621056    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:35.621074    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:35.621087    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:35.621097    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:35.621110    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:35.621122    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:35.621134    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:35.621153    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:35.621166    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:35.621180    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:35.621195    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:35.621209    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:35.621222    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:35.621236    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:35.621247    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:35.621260    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:35.621275    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:35.621292    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:35.621314    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:35.621331    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:35.621346    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:35.621362    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:35.626822    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:57:35.634880    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:57:35.635703    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:57:35.635734    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:57:35.635748    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:57:35.635766    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:35 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:57:36.009447    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:36 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:57:36.009475    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:36 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:57:36.124052    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:36 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:57:36.124070    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:36 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:57:36.124125    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:36 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:57:36.124148    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:36 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:57:36.124951    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:36 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:57:36.124963    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:36 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:57:37.623280    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 1
	I0731 10:57:37.623296    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:37.623371    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:37.624188    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:37.624260    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:37.624272    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:37.624282    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:37.624294    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:37.624302    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:37.624308    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:37.624321    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:37.624328    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:37.624347    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:37.624356    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:37.624364    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:37.624372    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:37.624379    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:37.624389    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:37.624396    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:37.624404    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:37.624414    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:37.624421    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:37.624427    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:37.624432    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:37.624438    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:37.624455    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:37.624464    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:37.624480    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:37.624489    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:39.625661    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 2
	I0731 10:57:39.625679    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:39.625766    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:39.626785    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:39.626857    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:39.626869    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:39.626881    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:39.626890    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:39.626899    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:39.626906    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:39.626915    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:39.626926    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:39.626936    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:39.626945    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:39.626955    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:39.626964    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:39.626975    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:39.626992    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:39.627005    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:39.627015    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:39.627026    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:39.627033    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:39.627046    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:39.627055    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:39.627070    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:39.627083    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:39.627108    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:39.627118    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:39.627126    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:41.484867    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:41 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 10:57:41.485016    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:41 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 10:57:41.485027    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:41 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 10:57:41.504938    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:57:41 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 10:57:41.627273    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 3
	I0731 10:57:41.627285    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:41.627328    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:41.628153    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:41.628210    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:41.628224    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:41.628236    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:41.628246    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:41.628264    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:41.628276    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:41.628285    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:41.628291    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:41.628300    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:41.628309    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:41.628315    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:41.628322    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:41.628337    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:41.628348    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:41.628359    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:41.628369    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:41.628377    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:41.628384    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:41.628396    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:41.628404    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:41.628411    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:41.628427    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:41.628434    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:41.628441    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:41.628450    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:43.629862    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 4
	I0731 10:57:43.629878    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:43.629976    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:43.630803    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:43.630863    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:43.630871    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:43.630881    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:43.630892    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:43.630901    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:43.630906    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:43.630913    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:43.630924    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:43.630939    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:43.630950    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:43.630956    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:43.630973    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:43.630987    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:43.630996    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:43.631002    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:43.631009    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:43.631017    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:43.631024    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:43.631031    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:43.631039    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:43.631054    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:43.631063    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:43.631070    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:43.631078    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:43.631093    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:45.631491    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 5
	I0731 10:57:45.631539    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:45.631648    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:45.632421    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:45.632476    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:45.632489    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:45.632499    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:45.632507    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:45.632515    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:45.632540    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:45.632554    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:45.632565    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:45.632580    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:45.632598    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:45.632618    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:45.632627    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:45.632639    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:45.632647    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:45.632656    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:45.632669    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:45.632678    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:45.632686    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:45.632694    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:45.632700    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:45.632709    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:45.632719    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:45.632735    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:45.632743    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:45.632750    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:47.634662    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 6
	I0731 10:57:47.634678    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:47.634729    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:47.635531    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:47.635562    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:47.635598    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:47.635614    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:47.635621    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:47.635630    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:47.635641    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:47.635649    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:47.635665    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:47.635679    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:47.635687    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:47.635695    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:47.635703    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:47.635709    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:47.635734    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:47.635748    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:47.635757    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:47.635768    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:47.635777    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:47.635783    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:47.635800    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:47.635812    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:47.635825    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:47.635835    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:47.635857    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:47.635868    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:49.636058    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 7
	I0731 10:57:49.636076    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:49.636125    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:49.636929    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:49.636998    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:49.637010    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:49.637025    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:49.637045    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:49.637072    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:49.637086    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:49.637095    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:49.637104    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:49.637128    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:49.637141    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:49.637154    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:49.637163    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:49.637172    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:49.637177    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:49.637184    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:49.637199    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:49.637210    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:49.637217    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:49.637224    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:49.637230    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:49.637238    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:49.637246    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:49.637254    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:49.637261    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:49.637268    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:51.638187    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 8
	I0731 10:57:51.638203    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:51.638257    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:51.639048    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:51.639095    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:51.639106    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:51.639122    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:51.639128    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:51.639134    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:51.639139    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:51.639145    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:51.639159    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:51.639168    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:51.639186    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:51.639199    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:51.639217    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:51.639225    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:51.639232    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:51.639239    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:51.639246    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:51.639254    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:51.639261    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:51.639269    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:51.639276    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:51.639283    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:51.639290    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:51.639297    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:51.639304    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:51.639312    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:53.640230    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 9
	I0731 10:57:53.640247    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:53.640357    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:53.641125    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:53.641192    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:53.641203    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:53.641212    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:53.641218    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:53.641234    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:53.641253    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:53.641264    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:53.641273    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:53.641294    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:53.641308    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:53.641318    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:53.641325    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:53.641337    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:53.641349    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:53.641358    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:53.641365    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:53.641382    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:53.641394    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:53.641409    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:53.641423    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:53.641433    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:53.641442    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:53.641449    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:53.641455    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:53.641470    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:55.642496    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 10
	I0731 10:57:55.642521    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:55.642607    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:55.643649    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:55.643724    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:55.643738    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:55.643748    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:55.643755    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:55.643761    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:55.643767    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:55.643784    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:55.643791    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:55.643797    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:55.643804    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:55.643826    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:55.643836    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:55.643849    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:55.643859    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:55.643866    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:55.643876    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:55.643887    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:55.643898    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:55.643909    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:55.643921    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:55.643932    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:55.643943    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:55.643951    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:55.643956    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:55.643965    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:57.644551    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 11
	I0731 10:57:57.644565    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:57.644677    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:57.645474    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:57.645519    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:57.645528    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:57.645539    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:57.645546    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:57.645553    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:57.645559    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:57.645595    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:57.645624    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:57.645635    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:57.645647    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:57.645660    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:57.645668    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:57.645676    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:57.645684    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:57.645694    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:57.645701    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:57.645709    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:57.645725    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:57.645738    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:57.645747    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:57.645753    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:57.645761    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:57.645775    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:57.645787    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:57.645796    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:57:59.647662    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 12
	I0731 10:57:59.647684    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:57:59.647725    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:57:59.648598    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:57:59.648625    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:57:59.648650    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:57:59.648660    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:57:59.648669    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:57:59.648676    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:57:59.648682    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:57:59.648688    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:57:59.648694    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:57:59.648707    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:57:59.648715    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:57:59.648721    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:57:59.648741    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:57:59.648753    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:57:59.648762    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:57:59.648771    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:57:59.648778    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:57:59.648785    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:57:59.648791    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:57:59.648810    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:57:59.648818    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:57:59.648827    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:57:59.648838    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:57:59.648846    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:57:59.648851    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:57:59.648858    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:01.649987    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 13
	I0731 10:58:01.650004    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:01.650080    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:01.650896    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:01.650933    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:01.650949    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:01.650961    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:01.650972    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:01.650982    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:01.650994    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:01.651001    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:01.651024    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:01.651034    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:01.651041    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:01.651049    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:01.651066    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:01.651077    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:01.651087    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:01.651094    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:01.651102    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:01.651109    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:01.651116    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:01.651124    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:01.651135    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:01.651141    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:01.651148    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:01.651155    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:01.651163    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:01.651168    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:03.651223    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 14
	I0731 10:58:03.651238    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:03.651299    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:03.652123    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:03.652178    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:05.654362    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 15
	I0731 10:58:05.654375    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:05.654462    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:05.655251    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:05.655313    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:07.655664    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 16
	I0731 10:58:07.655676    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:07.655744    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:07.656534    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:07.656590    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:09.658147    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 17
	I0731 10:58:09.658161    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:09.658230    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:09.659046    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:09.659099    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:11.659668    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 18
	I0731 10:58:11.659683    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:11.659769    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:11.660636    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:11.660653    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:11.661003    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:13.661539    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 19
	I0731 10:58:13.661551    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:13.661628    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:13.662442    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:13.662557    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:13.662570    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:13.662580    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:13.662590    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:13.662601    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:13.662608    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:13.662616    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:13.662624    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:13.662631    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:13.662639    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:13.662646    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:13.662654    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:13.662674    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:13.662687    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:13.662694    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:13.662699    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:13.662714    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:13.662721    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:13.662735    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:13.662744    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:13.662751    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:13.662756    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:13.662767    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:13.662780    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:13.662791    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:15.664634    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 20
	I0731 10:58:15.664651    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:15.664736    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:15.665529    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:15.665596    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:15.665607    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:15.665616    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:15.665622    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:15.665629    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:15.665637    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:15.665655    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:15.665671    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:15.665681    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:15.665690    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:15.665699    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:15.665705    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:15.665730    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:15.665739    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:15.665746    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:15.665754    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:15.665762    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:15.665768    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:15.665775    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:15.665792    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:15.665805    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:15.665813    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:15.665821    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:15.665829    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:15.665836    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:17.667788    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 21
	I0731 10:58:17.667801    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:17.667859    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:17.668667    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:17.668719    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:17.668744    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:17.668756    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:17.668767    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:17.668785    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:17.668793    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:17.668802    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:17.668819    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:17.668832    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:17.668845    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:17.668853    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:17.668874    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:17.668886    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:17.668898    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:17.668906    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:17.668916    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:17.668929    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:17.668945    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:17.668958    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:17.668966    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:17.668974    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:17.668995    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:17.669003    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:17.669017    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:17.669026    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:19.670864    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 22
	I0731 10:58:19.670878    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:19.670980    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:19.671755    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:19.671810    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:19.671828    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:19.671838    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:19.671844    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:19.671861    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:19.671872    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:19.671895    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:19.671908    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:19.671916    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:19.671924    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:19.671933    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:19.671941    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:19.671955    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:19.671968    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:19.671976    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:19.671985    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:19.671992    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:19.672004    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:19.672011    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:19.672021    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:19.672027    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:19.672034    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:19.672039    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:19.672046    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:19.672052    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:21.673487    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 23
	I0731 10:58:21.673499    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:21.673535    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:21.674327    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:21.674390    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:21.674401    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:21.674411    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:21.674420    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:21.674427    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:21.674433    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:21.674440    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:21.674446    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:21.674461    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:21.674476    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:21.674484    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:21.674492    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:21.674499    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:21.674504    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:21.674514    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:21.674522    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:21.674535    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:21.674544    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:21.674557    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:21.674572    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:21.674580    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:21.674586    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:21.674608    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:21.674618    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:21.674629    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:23.676517    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 24
	I0731 10:58:23.676533    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:23.676644    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:23.677460    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:23.677474    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:23.677503    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:23.677517    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:23.677525    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:23.677531    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:23.677541    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:23.677549    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:23.677561    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:23.677567    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:23.677574    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:23.677582    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:23.677589    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:23.677597    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:23.677604    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:23.677612    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:23.677628    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:23.677634    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:23.677641    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:23.677648    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:23.677664    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:23.677677    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:23.677687    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:23.677695    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:23.677710    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:23.677722    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:25.677878    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 25
	I0731 10:58:25.677894    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:25.677999    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:25.678788    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:25.678845    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:25.678858    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:25.678893    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:25.678906    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:25.678918    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:25.678926    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:25.678939    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:25.678963    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:25.678969    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:25.678981    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:25.678989    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:25.678998    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:25.679007    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:25.679019    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:25.679027    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:25.679033    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:25.679040    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:25.679051    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:25.679061    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:25.679069    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:25.679076    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:25.679082    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:25.679090    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:25.679096    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:25.679103    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:27.679871    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 26
	I0731 10:58:27.679886    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:27.679990    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:27.680780    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:27.680849    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:27.680861    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:27.680869    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:27.680877    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:27.680890    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:27.680903    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:27.680911    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:27.680918    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:27.680927    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:27.680945    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:27.680960    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:27.680970    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:27.680978    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:27.680985    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:27.680992    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:27.680999    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:27.681007    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:27.681014    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:27.681021    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:27.681028    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:27.681036    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:27.681047    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:27.681054    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:27.681067    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:27.681079    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:29.681013    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 27
	I0731 10:58:29.681028    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:29.681090    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:29.681930    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:29.681943    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:29.681950    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:29.681958    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:29.681966    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:29.681989    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:29.682001    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:29.682010    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:29.682021    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:29.682030    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:29.682035    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:29.682041    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:29.682048    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:29.682066    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:29.682073    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:29.682080    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:29.682088    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:29.682100    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:29.682111    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:29.682120    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:29.682137    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:29.682144    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:29.682152    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:29.682158    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:29.682170    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:29.682183    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:31.683290    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 28
	I0731 10:58:31.683305    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:31.683315    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:31.684158    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:31.684170    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:31.684177    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:31.684192    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:31.684199    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:31.684220    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:31.684234    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:31.684242    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:31.684251    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:31.684264    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:31.684275    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:31.684282    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:31.684290    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:31.684296    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:31.684304    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:31.684311    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:31.684319    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:31.684326    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:31.684333    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:31.684340    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:31.684349    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:31.684363    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:31.684374    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:31.684391    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:31.684403    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:31.684413    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:33.685782    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 29
	I0731 10:58:33.685809    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:33.685892    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:33.686714    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for ea:e9:ee:61:aa:d in /var/db/dhcpd_leases ...
	I0731 10:58:33.686795    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:33.686807    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:33.686834    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:33.686846    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:33.686854    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:33.686860    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:33.686868    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:33.686877    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:33.686884    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:33.686891    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:33.686898    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:33.686906    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:33.686927    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:33.686937    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:33.686952    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:33.686961    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:33.686971    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:33.686982    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:33.686993    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:33.686999    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:33.687006    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:33.687011    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:33.687032    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:33.687044    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:33.687059    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:35.688994    6308 client.go:171] duration metric: took 1m0.99396285s to LocalClient.Create
	I0731 10:58:37.690042    6308 start.go:128] duration metric: took 1m3.027326431s to createHost
	I0731 10:58:37.690057    6308 start.go:83] releasing machines lock for "docker-flags-723000", held for 1m3.027463955s
	W0731 10:58:37.690073    6308 start.go:714] error starting host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ea:e9:ee:61:aa:d
	I0731 10:58:37.690379    6308 main.go:141] libmachine: Found binary path at /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:58:37.690399    6308 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:58:37.698919    6308 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54747
	I0731 10:58:37.699307    6308 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:58:37.699674    6308 main.go:141] libmachine: Using API Version  1
	I0731 10:58:37.699697    6308 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:58:37.699902    6308 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:58:37.700265    6308 main.go:141] libmachine: Found binary path at /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:58:37.700283    6308 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:58:37.708656    6308 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54749
	I0731 10:58:37.709032    6308 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:58:37.709421    6308 main.go:141] libmachine: Using API Version  1
	I0731 10:58:37.709445    6308 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:58:37.709651    6308 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:58:37.709774    6308 main.go:141] libmachine: (docker-flags-723000) Calling .GetState
	I0731 10:58:37.709861    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:37.709929    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:37.710925    6308 main.go:141] libmachine: (docker-flags-723000) Calling .DriverName
	I0731 10:58:37.732875    6308 out.go:177] * Deleting "docker-flags-723000" in hyperkit ...
	I0731 10:58:37.779265    6308 main.go:141] libmachine: (docker-flags-723000) Calling .Remove
	I0731 10:58:37.779375    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:37.779388    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:37.779440    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:37.780346    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:37.780405    6308 main.go:141] libmachine: (docker-flags-723000) DBG | waiting for graceful shutdown
	I0731 10:58:38.782163    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:38.782229    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:38.783170    6308 main.go:141] libmachine: (docker-flags-723000) DBG | waiting for graceful shutdown
	I0731 10:58:39.784204    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:39.784319    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:39.785873    6308 main.go:141] libmachine: (docker-flags-723000) DBG | waiting for graceful shutdown
	I0731 10:58:40.787964    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:40.788048    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:40.788605    6308 main.go:141] libmachine: (docker-flags-723000) DBG | waiting for graceful shutdown
	I0731 10:58:41.790327    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:41.790423    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:41.790989    6308 main.go:141] libmachine: (docker-flags-723000) DBG | waiting for graceful shutdown
	I0731 10:58:42.793068    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:42.793140    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6329
	I0731 10:58:42.794227    6308 main.go:141] libmachine: (docker-flags-723000) DBG | sending sigkill
	I0731 10:58:42.794241    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	W0731 10:58:42.803866    6308 out.go:239] ! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ea:e9:ee:61:aa:d
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for ea:e9:ee:61:aa:d
	I0731 10:58:42.803886    6308 start.go:729] Will try again in 5 seconds ...
	I0731 10:58:42.833835    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:42 WARN : hyperkit: failed to read stdout: EOF
	I0731 10:58:42.833856    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:42 WARN : hyperkit: failed to read stderr: EOF
	I0731 10:58:47.805485    6308 start.go:360] acquireMachinesLock for docker-flags-723000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:58:47.805584    6308 start.go:364] duration metric: took 74.617µs to acquireMachinesLock for "docker-flags-723000"
	I0731 10:58:47.805601    6308 start.go:93] Provisioning new machine with config: &{Name:docker-flags-723000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[FOO=BAR BAZ=BAT] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[debug icc=true] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:docker-flags-723000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:false EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:false apps_running:false default_sa:false extra:false kubelet:false node_ready:false system_pods:false] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:58:47.805673    6308 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 10:58:47.828968    6308 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0731 10:58:47.829031    6308 main.go:141] libmachine: Found binary path at /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:58:47.829046    6308 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:58:47.837618    6308 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54751
	I0731 10:58:47.838179    6308 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:58:47.838509    6308 main.go:141] libmachine: Using API Version  1
	I0731 10:58:47.838520    6308 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:58:47.838725    6308 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:58:47.838839    6308 main.go:141] libmachine: (docker-flags-723000) Calling .GetMachineName
	I0731 10:58:47.838931    6308 main.go:141] libmachine: (docker-flags-723000) Calling .DriverName
	I0731 10:58:47.839040    6308 start.go:159] libmachine.API.Create for "docker-flags-723000" (driver="hyperkit")
	I0731 10:58:47.839054    6308 client.go:168] LocalClient.Create starting
	I0731 10:58:47.839079    6308 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 10:58:47.839131    6308 main.go:141] libmachine: Decoding PEM data...
	I0731 10:58:47.839141    6308 main.go:141] libmachine: Parsing certificate...
	I0731 10:58:47.839182    6308 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 10:58:47.839238    6308 main.go:141] libmachine: Decoding PEM data...
	I0731 10:58:47.839250    6308 main.go:141] libmachine: Parsing certificate...
	I0731 10:58:47.839262    6308 main.go:141] libmachine: Running pre-create checks...
	I0731 10:58:47.839268    6308 main.go:141] libmachine: (docker-flags-723000) Calling .PreCreateCheck
	I0731 10:58:47.839372    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:47.839451    6308 main.go:141] libmachine: (docker-flags-723000) Calling .GetConfigRaw
	I0731 10:58:47.850927    6308 main.go:141] libmachine: Creating machine...
	I0731 10:58:47.850935    6308 main.go:141] libmachine: (docker-flags-723000) Calling .Create
	I0731 10:58:47.851019    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:47.851166    6308 main.go:141] libmachine: (docker-flags-723000) DBG | I0731 10:58:47.851012    6339 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:58:47.851210    6308 main.go:141] libmachine: (docker-flags-723000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 10:58:48.035163    6308 main.go:141] libmachine: (docker-flags-723000) DBG | I0731 10:58:48.035071    6339 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/id_rsa...
	I0731 10:58:48.172840    6308 main.go:141] libmachine: (docker-flags-723000) DBG | I0731 10:58:48.172756    6339 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/docker-flags-723000.rawdisk...
	I0731 10:58:48.172852    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Writing magic tar header
	I0731 10:58:48.172862    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Writing SSH key tar header
	I0731 10:58:48.173215    6308 main.go:141] libmachine: (docker-flags-723000) DBG | I0731 10:58:48.173183    6339 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000 ...
	I0731 10:58:48.555657    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:48.555673    6308 main.go:141] libmachine: (docker-flags-723000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/hyperkit.pid
	I0731 10:58:48.555712    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Using UUID 1f076b7f-bd06-4022-b6e1-042ff4a353e2
	I0731 10:58:48.581964    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Generated MAC 5e:22:a9:b7:d:2e
	I0731 10:58:48.581983    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-723000
	I0731 10:58:48.582009    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"1f076b7f-bd06-4022-b6e1-042ff4a353e2", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:58:48.582040    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"1f076b7f-bd06-4022-b6e1-042ff4a353e2", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:58:48.582093    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "1f076b7f-bd06-4022-b6e1-042ff4a353e2", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/docker-flags-723000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-723000"}
	I0731 10:58:48.582137    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 1f076b7f-bd06-4022-b6e1-042ff4a353e2 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/docker-flags-723000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=docker-flags-723000"
	I0731 10:58:48.582155    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:58:48.585160    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 DEBUG: hyperkit: Pid is 6340
	I0731 10:58:48.585729    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 0
	I0731 10:58:48.585746    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:48.585829    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:58:48.586782    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:58:48.586861    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:48.586878    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:48.586901    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:48.586919    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:48.586949    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:48.586996    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:48.587011    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:48.587043    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:48.587060    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:48.587087    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:48.587107    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:48.587122    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:48.587144    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:48.587156    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:48.587198    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:48.587219    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:48.587232    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:48.587245    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:48.587258    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:48.587272    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:48.587285    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:48.587299    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:48.587312    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:48.587336    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:48.587360    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
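The "Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases" loop above is the driver polling the macOS DHCP lease file for the VM's freshly generated MAC; the test fails with "IP address never found in dhcp leases file" when no entry ever appears. A minimal sketch of that lookup (the `findIP` helper is hypothetical, not minikube's actual code; note that macOS writes lease octets with leading zeros stripped, e.g. `9e:4a:75:aa:a4:d`, so the MAC must be compared in that same normalized form):

```go
package main

import (
	"fmt"
	"regexp"
)

// findIP scans dhcpd_leases-style entries (as dumped in the log above)
// and returns the IP bound to the given hardware address, if any.
func findIP(leases []string, hwAddr string) (string, bool) {
	re := regexp.MustCompile(`IPAddress:(\S+) HWAddress:(\S+)`)
	for _, l := range leases {
		if m := re.FindStringSubmatch(l); m != nil && m[2] == hwAddr {
			return m[1], true // lease found for this MAC
		}
	}
	return "", false // caller retries until the lease shows up or times out
}

func main() {
	leases := []string{
		"{Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}",
		"{Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}",
	}
	ip, ok := findIP(leases, "9e:4a:75:aa:a4:d")
	fmt.Println(ip, ok)
	_, ok = findIP(leases, "5e:22:a9:b7:d:2e") // VM's MAC, not yet leased
	fmt.Println(ok)
}
```

In the failing run this lookup never succeeds for the new MAC, so the driver retries every ~2s until the createHost timeout, then deletes and recreates the VM.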
	I0731 10:58:48.592665    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:58:48.601047    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/docker-flags-723000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:58:48.601963    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:58:48.601995    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:58:48.602019    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:58:48.602031    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:58:48.981810    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:58:48.981827    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:48 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:58:49.096540    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:49 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:58:49.096561    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:49 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:58:49.096585    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:49 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:58:49.096624    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:49 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:58:49.097384    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:49 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:58:49.097394    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:49 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:58:50.588663    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 1
	I0731 10:58:50.588680    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:50.588800    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:58:50.589631    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:58:50.589704    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:50.589711    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:50.589722    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:50.589728    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:50.589734    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:50.589741    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:50.589755    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:50.589767    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:50.589776    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:50.589786    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:50.589810    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:50.589818    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:50.589827    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:50.589835    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:50.589842    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:50.589851    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:50.589858    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:50.589864    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:50.589874    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:50.589884    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:50.589894    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:50.589909    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:50.589943    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:50.589953    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:50.589967    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:52.590572    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 2
	I0731 10:58:52.590595    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:52.590719    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:58:52.591609    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:58:52.591676    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:52.591687    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:52.591723    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:52.591758    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:52.591788    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:52.591798    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:52.591810    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:52.591817    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:52.591825    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:52.591838    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:52.591853    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:52.591863    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:52.591871    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:52.591888    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:52.591897    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:52.591904    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:52.591912    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:52.591939    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:52.591953    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:52.591965    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:52.591975    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:52.591987    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:52.591993    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:52.592001    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:52.592010    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:54.453415    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:54 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 10:58:54.453594    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:54 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 10:58:54.453606    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:54 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 10:58:54.473527    6308 main.go:141] libmachine: (docker-flags-723000) DBG | 2024/07/31 10:58:54 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 10:58:54.592812    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 3
	I0731 10:58:54.592829    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:54.592883    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:58:54.593688    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:58:54.593743    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:54.593751    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:54.593759    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:54.593772    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:54.593782    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:54.593791    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:54.593798    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:54.593805    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:54.593820    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:54.593832    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:54.593842    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:54.593850    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:54.593857    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:54.593866    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:54.593872    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:54.593878    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:54.593885    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:54.593892    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:54.593899    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:54.593912    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:54.593925    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:54.593933    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:54.593940    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:54.593946    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:54.593953    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:56.595571    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 4
	I0731 10:58:56.595589    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:56.595688    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:58:56.596533    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:58:56.596610    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:56.596619    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:56.596646    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:56.596659    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:56.596677    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:56.596691    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:56.596703    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:56.596713    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:56.596729    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:56.596738    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:56.596748    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:56.596757    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:56.596763    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:56.596773    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:56.596781    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:56.596790    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:56.596797    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:56.596805    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:56.596811    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:56.596821    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:56.596828    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:56.596834    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:56.596856    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:56.596869    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:56.596879    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:58:58.598765    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 5
	I0731 10:58:58.598782    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:58:58.598894    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:58:58.599732    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:58:58.599760    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:58:58.599774    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:58:58.599787    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:58:58.599796    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:58:58.599803    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:58:58.599810    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:58:58.599825    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:58:58.599839    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:58:58.599849    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:58:58.599864    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:58:58.599887    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:58:58.599899    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:58:58.599907    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:58:58.599915    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:58:58.599922    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:58:58.599929    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:58:58.599939    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:58:58.599949    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:58:58.599957    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:58:58.599965    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:58:58.599972    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:58:58.599980    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:58:58.599996    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:58:58.600010    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:58:58.600024    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:00.601153    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 6
	I0731 10:59:00.601170    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:00.601285    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:00.602100    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:00.602162    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:00.602171    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:00.602194    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:00.602215    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:00.602225    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:00.602232    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:00.602239    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:00.602245    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:00.602262    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:00.602274    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:00.602282    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:00.602288    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:00.602307    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:00.602326    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:00.602338    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:00.602350    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:00.602358    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:00.602366    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:00.602378    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:00.602386    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:00.602394    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:00.602401    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:00.602408    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:00.602417    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:00.602426    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:02.604189    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 7
	I0731 10:59:02.604211    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:02.604313    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:02.605128    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:02.605165    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:02.605176    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:02.605185    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:02.605191    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:02.605207    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:02.605214    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:02.605221    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:02.605227    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:02.605235    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:02.605242    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:02.605248    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:02.605256    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:02.605275    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:02.605288    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:02.605296    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:02.605304    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:02.605312    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:02.605319    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:02.605324    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:02.605332    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:02.605339    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:02.605346    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:02.605353    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:02.605362    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:02.605370    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:04.607339    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 8
	I0731 10:59:04.607352    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:04.607429    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:04.608276    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:04.608299    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:04.608309    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:04.608319    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:04.608326    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:04.608338    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:04.608345    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:04.608351    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:04.608359    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:04.608366    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:04.608371    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:04.608392    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:04.608404    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:04.608419    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:04.608427    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:04.608435    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:04.608442    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:04.608449    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:04.608456    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:04.608463    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:04.608472    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:04.608479    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:04.608485    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:04.608497    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:04.608509    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:04.608519    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:06.609360    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 9
	I0731 10:59:06.609374    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:06.609424    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:06.610261    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:06.610283    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:06.610302    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:06.610315    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:06.610323    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:06.610334    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:06.610344    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:06.610350    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:06.610356    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:06.610362    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:06.610368    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:06.610374    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:06.610388    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:06.610394    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:06.610408    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:06.610432    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:06.610444    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:06.610456    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:06.610464    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:06.610473    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:06.610481    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:06.610492    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:06.610499    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:06.610506    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:06.610511    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:06.610518    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:08.610437    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 10
	I0731 10:59:08.610454    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:08.610553    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:08.611361    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:08.611429    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:08.611439    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:08.611452    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:08.611465    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:08.611475    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:08.611485    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:08.611495    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:08.611503    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:08.611511    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:08.611516    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:08.611524    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:08.611533    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:08.611541    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:08.611550    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:08.611562    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:08.611577    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:08.611592    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:08.611602    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:08.611611    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:08.611617    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:08.611623    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:08.611636    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:08.611644    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:08.611650    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:08.611658    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:10.612003    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 11
	I0731 10:59:10.612019    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:10.612075    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:10.612878    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:10.612942    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:10.612954    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:10.612963    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:10.612969    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:10.612978    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:10.612985    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:10.612993    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:10.613000    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:10.613014    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:10.613022    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:10.613029    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:10.613035    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:10.613041    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:10.613049    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:10.613055    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:10.613063    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:10.613070    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:10.613077    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:10.613107    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:10.613118    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:10.613126    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:10.613136    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:10.613146    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:10.613159    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:10.613168    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:12.613228    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 12
	I0731 10:59:12.613242    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:12.613301    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:12.614158    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:12.614231    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:12.614245    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:12.614258    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:12.614275    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:12.614299    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:12.614319    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:12.614332    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:12.614339    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:12.614346    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:12.614359    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:12.614372    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:12.614385    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:12.614393    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:12.614399    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:12.614408    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:12.614415    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:12.614422    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:12.614431    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:12.614438    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:12.614445    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:12.614452    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:12.614459    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:12.614466    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:12.614473    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:12.614481    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:14.614393    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 13
	I0731 10:59:14.614408    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:14.614532    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:14.615318    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:14.615376    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:14.615388    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:14.615397    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:14.615406    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:14.615413    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:14.615419    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:14.615430    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:14.615437    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:14.615450    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:14.615458    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:14.615465    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:14.615471    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:14.615496    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:14.615509    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:14.615531    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:14.615540    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:14.615549    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:14.615556    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:14.615564    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:14.615585    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:14.615597    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:14.615605    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:14.615612    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:14.615619    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:14.615628    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:16.615787    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 14
	I0731 10:59:16.615802    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:16.615917    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:16.616723    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:16.616768    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:16.616786    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:16.616794    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:16.616809    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:16.616826    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:16.616838    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:16.616848    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:16.616855    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:16.616863    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:16.616869    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:16.616877    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:16.616890    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:16.616898    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:16.616905    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:16.616916    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:16.616924    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:16.616931    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:16.616938    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:16.616944    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:16.616950    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:16.616960    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:16.616966    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:16.616972    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:16.616978    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:16.616986    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:18.618882    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 15
	I0731 10:59:18.618900    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:18.618969    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:18.619788    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:18.619839    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:18.619849    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:18.619857    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:18.619863    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:18.619869    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:18.619875    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:18.619880    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:18.619896    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:18.619903    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:18.619909    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:18.619915    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:18.619922    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:18.619932    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:18.619946    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:18.619961    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:18.619974    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:18.619982    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:18.619990    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:18.620000    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:18.620008    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:18.620016    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:18.620024    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:18.620043    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:18.620052    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:18.620059    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:20.621316    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 16
	I0731 10:59:20.621329    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:20.621383    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:20.622270    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:20.622291    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:20.622300    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:20.622308    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:20.622313    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:20.622322    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:20.622332    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:20.622363    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:20.622377    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:20.622387    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:20.622409    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:20.622420    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:20.622429    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:20.622438    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:20.622446    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:20.622457    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:20.622465    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:20.622473    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:20.622483    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:20.622490    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:20.622498    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:20.622504    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:20.622512    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:20.622519    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:20.622525    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:20.622541    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:22.623085    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 17
	I0731 10:59:22.623101    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:22.623197    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:22.624009    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:22.624052    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:22.624070    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:22.624086    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:22.624103    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:22.624113    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:22.624119    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:22.624151    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:22.624161    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:22.624168    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:22.624176    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:22.624183    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:22.624189    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:22.624195    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:22.624203    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:22.624211    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:22.624219    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:22.624226    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:22.624233    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:22.624240    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:22.624247    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:22.624254    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:22.624260    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:22.624271    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:22.624283    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:22.624292    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:24.625082    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 18
	I0731 10:59:24.625095    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:24.625126    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:24.625953    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:24.625987    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:24.625997    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:24.626024    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:24.626035    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:24.626043    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:24.626051    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:24.626063    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:24.626069    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:24.626077    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:24.626092    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:24.626101    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:24.626106    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:24.626116    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:24.626125    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:24.626133    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:24.626138    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:24.626153    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:24.626163    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:24.626173    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:24.626181    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:24.626193    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:24.626200    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:24.626211    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:24.626219    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:24.626227    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:26.627950    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 19
	I0731 10:59:26.627963    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:26.628064    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:26.628866    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:26.628921    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:26.628929    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:26.628937    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:26.628943    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:26.628948    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:26.628953    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:26.628960    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:26.628966    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:26.628990    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:26.629003    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:26.629015    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:26.629022    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:26.629033    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:26.629042    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:26.629049    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:26.629057    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:26.629063    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:26.629074    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:26.629082    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:26.629090    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:26.629102    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:26.629109    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:26.629116    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:26.629124    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:26.629132    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
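Each attempt above re-reads `/var/db/dhcpd_leases` and scans the logged entries for the VM's MAC address (`5e:22:a9:b7:d:2e`), which never appears among the 24 leases. A minimal sketch of that parse-and-search step, using hypothetical names (`parseLeases`, `findIP`) rather than the driver's actual functions:

```go
package main

import (
	"fmt"
	"regexp"
)

// leaseEntry holds the fields the driver logs for each lease
// record (field names here are illustrative, not the driver's).
type leaseEntry struct {
	Name, IP, HWAddress string
}

// Matches the logged form:
// {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,... Lease:0x66abcb82}
var leaseRe = regexp.MustCompile(`\{Name:(\S+) IPAddress:(\S+) HWAddress:(\S+) ID:\S+ Lease:\S+\}`)

// parseLeases extracts lease entries from lease-file style lines.
func parseLeases(lines []string) []leaseEntry {
	var out []leaseEntry
	for _, l := range lines {
		if m := leaseRe.FindStringSubmatch(l); m != nil {
			out = append(out, leaseEntry{Name: m[1], IP: m[2], HWAddress: m[3]})
		}
	}
	return out
}

// findIP returns the IP leased to mac, or "" when no entry matches —
// the "" case is what keeps the driver retrying in the log above.
func findIP(entries []leaseEntry, mac string) string {
	for _, e := range entries {
		if e.HWAddress == mac {
			return e.IP
		}
	}
	return ""
}

func main() {
	lines := []string{
		"{Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}",
		"{Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}",
	}
	entries := parseLeases(lines)
	fmt.Println(findIP(entries, "da:4c:82:ac:eb:ca"))       // 192.169.0.25
	fmt.Println(findIP(entries, "5e:22:a9:b7:d:2e") == "") // true: a miss, as in every attempt logged here
}
```

With no matching lease, the search comes up empty on every pass, which is why the attempt counter keeps climbing until the 2m24s timeout reported at the top of the test.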
	I0731 10:59:28.630518    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 20
	I0731 10:59:28.630532    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:28.630542    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:28.631424    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:28.631484    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:28.631499    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:28.631509    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:28.631517    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:28.631525    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:28.631538    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:28.631552    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:28.631560    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:28.631569    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:28.631576    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:28.631582    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:28.631599    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:28.631608    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:28.631616    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:28.631629    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:28.631636    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:28.631645    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:28.631652    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:28.631658    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:28.631679    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:28.631694    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:28.631703    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:28.631709    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:28.631724    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:28.631738    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	[Attempts 21–24 (10:59:30 – 10:59:36) repeat the identical 24-entry /var/db/dhcpd_leases scan; 5e:22:a9:b7:d:2e is not found in any of them.]
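One detail worth noting in this log: the MAC being searched for, `5e:22:a9:b7:d:2e`, is written with its fifth octet unpadded (`d`, not `0d`), matching the unpadded octet style macOS's DHCP daemon uses in `/var/db/dhcpd_leases` (several logged entries, e.g. `9e:4a:75:aa:a4:d`, show the same form). When one side of such a comparison is zero-padded and the other is not, a byte-equal match silently fails. A hedged sketch of octet normalization that makes the comparison robust either way — `normalizeMAC` and `sameMAC` are illustrative helpers, not the driver's API:

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeMAC lower-cases a MAC and strips leading zeros from each
// octet, producing the unpadded form seen in the lease file above.
func normalizeMAC(mac string) string {
	parts := strings.Split(strings.ToLower(mac), ":")
	for i, p := range parts {
		p = strings.TrimLeft(p, "0")
		if p == "" { // an all-zero octet ("00") must stay "0", not ""
			p = "0"
		}
		parts[i] = p
	}
	return strings.Join(parts, ":")
}

// sameMAC compares two MACs regardless of per-octet zero padding.
func sameMAC(a, b string) bool {
	return normalizeMAC(a) == normalizeMAC(b)
}

func main() {
	fmt.Println(sameMAC("5E:22:A9:B7:0D:2E", "5e:22:a9:b7:d:2e")) // true
	fmt.Println(normalizeMAC("00:0a:0b:00:00:01"))                // 0:a:b:0:0:1
}
```

In this failure, however, the searched MAC is already in unpadded form and simply never shows up in the lease file, so the retries exhaust and the VM creation times out rather than mismatching on padding.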
	I0731 10:59:38.643786    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 25
	I0731 10:59:38.643802    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:38.643864    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:38.644736    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:38.644798    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:38.644810    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:38.644819    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:38.644826    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:38.644835    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:38.644844    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:38.644851    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:38.644864    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:38.644872    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:38.644878    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:38.644884    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:38.644895    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:38.644903    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:38.644910    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:38.644926    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:38.644937    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:38.644946    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:38.644953    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:38.644973    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:38.644986    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:38.644994    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:38.645005    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:38.645013    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:38.645020    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:38.645029    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:40.646917    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 26
	I0731 10:59:40.646936    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:40.647036    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:40.647859    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:40.647903    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:40.647914    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:40.647923    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:40.647931    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:40.647943    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:40.647957    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:40.647966    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:40.647973    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:40.647980    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:40.647988    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:40.647995    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:40.648002    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:40.648011    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:40.648023    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:40.648030    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:40.648043    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:40.648054    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:40.648068    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:40.648076    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:40.648084    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:40.648093    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:40.648112    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:40.648121    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:40.648131    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:40.648141    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:42.649336    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 27
	I0731 10:59:42.649354    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:42.649421    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:42.650239    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:42.650295    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:42.650306    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:42.650315    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:42.650323    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:42.650339    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:42.650346    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:42.650352    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:42.650359    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:42.650365    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:42.650372    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:42.650378    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:42.650393    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:42.650404    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:42.650413    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:42.650421    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:42.650440    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:42.650456    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:42.650470    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:42.650481    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:42.650489    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:42.650497    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:42.650509    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:42.650517    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:42.650524    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:42.650532    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:44.650831    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 28
	I0731 10:59:44.650858    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:44.650897    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:44.651728    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:44.651745    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:44.651752    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:44.651762    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:44.651768    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:44.651776    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:44.651785    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:44.651795    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:44.651802    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:44.651819    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:44.651832    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:44.651841    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:44.651847    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:44.651860    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:44.651869    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:44.651884    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:44.651895    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:44.651903    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:44.651911    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:44.651920    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:44.651928    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:44.651936    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:44.651944    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:44.651951    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:44.651957    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:44.651962    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:46.651885    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Attempt 29
	I0731 10:59:46.651898    6308 main.go:141] libmachine: (docker-flags-723000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:59:46.651959    6308 main.go:141] libmachine: (docker-flags-723000) DBG | hyperkit pid from json: 6340
	I0731 10:59:46.652801    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Searching for 5e:22:a9:b7:d:2e in /var/db/dhcpd_leases ...
	I0731 10:59:46.652813    6308 main.go:141] libmachine: (docker-flags-723000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:59:46.652819    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:59:46.652827    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:59:46.652833    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:59:46.652848    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:59:46.652865    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:59:46.652880    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:59:46.652889    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:59:46.652896    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:59:46.652901    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:59:46.652915    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:59:46.652928    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:59:46.652937    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:59:46.652945    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:59:46.652953    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:59:46.652961    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:59:46.652967    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:59:46.652975    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:59:46.652990    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:59:46.653003    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:59:46.653012    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:59:46.653018    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:59:46.653024    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:59:46.653048    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:59:46.653059    6308 main.go:141] libmachine: (docker-flags-723000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:59:48.654995    6308 client.go:171] duration metric: took 1m0.817202412s to LocalClient.Create
	I0731 10:59:50.656334    6308 start.go:128] duration metric: took 1m2.851962428s to createHost
	I0731 10:59:50.656393    6308 start.go:83] releasing machines lock for "docker-flags-723000", held for 1m2.852112511s
	W0731 10:59:50.656458    6308 out.go:239] * Failed to start hyperkit VM. Running "minikube delete -p docker-flags-723000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 5e:22:a9:b7:d:2e
	* Failed to start hyperkit VM. Running "minikube delete -p docker-flags-723000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 5e:22:a9:b7:d:2e
	I0731 10:59:50.678350    6308 out.go:177] 
	W0731 10:59:50.699694    6308 out.go:239] X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 5e:22:a9:b7:d:2e
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 5e:22:a9:b7:d:2e
	W0731 10:59:50.699709    6308 out.go:239] * 
	* 
	W0731 10:59:50.700372    6308 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:59:50.762687    6308 out.go:177] 

                                                
                                                
** /stderr **
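The retry loop in the stderr output above (attempts 25–29) repeatedly re-reads /var/db/dhcpd_leases looking for the VM's MAC address (5e:22:a9:b7:d:2e), finds only the 24 stale minikube leases, and eventually gives up with "IP address never found in dhcp leases file". A minimal sketch of that lookup, assuming macOS's brace-delimited lease-block format (`findIPByMAC` is a hypothetical helper, not the driver's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// findIPByMAC scans dhcpd_leases-style content for the lease block whose
// hw_address line contains mac, and returns that block's ip_address.
// The block layout here is an assumption based on the macOS
// /var/db/dhcpd_leases format; the real parsing lives in the hyperkit driver.
func findIPByMAC(leases, mac string) (string, bool) {
	var ip string
	var matched bool
	for _, raw := range strings.Split(leases, "\n") {
		line := strings.TrimSpace(raw)
		switch {
		case line == "{": // new lease block: reset per-block state
			ip, matched = "", false
		case strings.HasPrefix(line, "ip_address="):
			ip = strings.TrimPrefix(line, "ip_address=")
		case strings.HasPrefix(line, "hw_address=") && strings.Contains(line, mac):
			matched = true
		case line == "}": // end of block: report if the MAC was seen
			if matched && ip != "" {
				return ip, true
			}
		}
	}
	return "", false
}

func main() {
	sample := `{
	name=minikube
	ip_address=192.169.0.25
	hw_address=1,da:4c:82:ac:eb:ca
	lease=0x66abcb82
}`
	// The MAC the failing run searched for is absent, so the lookup misses,
	// mirroring the repeated "Searching for 5e:22:a9:b7:d:2e" attempts above.
	fmt.Println(findIPByMAC(sample, "5e:22:a9:b7:d:2e"))
	fmt.Println(findIPByMAC(sample, "da:4c:82:ac:eb:ca"))
}
```

The driver wraps this kind of scan in a timed retry (one attempt every ~2 s in the log); when the guest never obtains a lease, no amount of retrying helps, which is why the failure surfaces only after the full timeout.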
docker_test.go:53: failed to start minikube with args: "out/minikube-darwin-amd64 start -p docker-flags-723000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit " : exit status 80
docker_test.go:56: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-723000 ssh "sudo systemctl show docker --property=Environment --no-pager"
docker_test.go:56: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p docker-flags-723000 ssh "sudo systemctl show docker --property=Environment --no-pager": exit status 50 (163.612521ms)

                                                
                                                
-- stdout --
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node docker-flags-723000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

                                                
                                                
** /stderr **
docker_test.go:58: failed to 'systemctl show docker' inside minikube. args "out/minikube-darwin-amd64 -p docker-flags-723000 ssh \"sudo systemctl show docker --property=Environment --no-pager\"": exit status 50
docker_test.go:63: expected env key/value "FOO=BAR" to be passed to minikube's docker and be included in: *"\n\n"*.
docker_test.go:63: expected env key/value "BAZ=BAT" to be passed to minikube's docker and be included in: *"\n\n"*.
docker_test.go:67: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-723000 ssh "sudo systemctl show docker --property=ExecStart --no-pager"
docker_test.go:67: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p docker-flags-723000 ssh "sudo systemctl show docker --property=ExecStart --no-pager": exit status 50 (162.042492ms)

                                                
                                                
-- stdout --
	
	
-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node docker-flags-723000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>
** /stderr **
docker_test.go:69: failed on the second 'systemctl show docker' inside minikube. args "out/minikube-darwin-amd64 -p docker-flags-723000 ssh \"sudo systemctl show docker --property=ExecStart --no-pager\"": exit status 50
docker_test.go:73: expected "out/minikube-darwin-amd64 -p docker-flags-723000 ssh \"sudo systemctl show docker --property=ExecStart --no-pager\"" output to have include *--debug* . output: "\n\n"
panic.go:626: *** TestDockerFlags FAILED at 2024-07-31 10:59:51.21891 -0700 PDT m=+4836.320509594
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p docker-flags-723000 -n docker-flags-723000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p docker-flags-723000 -n docker-flags-723000: exit status 7 (77.447096ms)
-- stdout --
	Error
-- /stdout --
** stderr ** 
	E0731 10:59:51.294694    6354 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0731 10:59:51.294717    6354 status.go:249] status error: getting IP: IP address is not set
** /stderr **
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "docker-flags-723000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:175: Cleaning up "docker-flags-723000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p docker-flags-723000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p docker-flags-723000: (5.23948904s)
--- FAIL: TestDockerFlags (199.25s)
TestForceSystemdFlag (149.88s)
=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag
=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-flag-854000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:91: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p force-systemd-flag-854000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit : exit status 80 (2m24.305084543s)
-- stdout --
	* [force-systemd-flag-854000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "force-systemd-flag-854000" primary control-plane node in "force-systemd-flag-854000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "force-systemd-flag-854000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	
-- /stdout --
** stderr ** 
	I0731 10:50:32.170386    6097 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:50:32.170558    6097 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:50:32.170564    6097 out.go:304] Setting ErrFile to fd 2...
	I0731 10:50:32.170568    6097 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:50:32.170743    6097 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:50:32.172261    6097 out.go:298] Setting JSON to false
	I0731 10:50:32.194717    6097 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":4802,"bootTime":1722443430,"procs":465,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:50:32.194817    6097 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:50:32.217160    6097 out.go:177] * [force-systemd-flag-854000] minikube v1.33.1 on Darwin 14.5
	I0731 10:50:32.262936    6097 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:50:32.262974    6097 notify.go:220] Checking for updates...
	I0731 10:50:32.306738    6097 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:50:32.327507    6097 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:50:32.348910    6097 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:50:32.369987    6097 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:50:32.390727    6097 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:50:32.412767    6097 config.go:182] Loaded profile config "NoKubernetes-782000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v0.0.0
	I0731 10:50:32.412952    6097 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:50:32.442880    6097 out.go:177] * Using the hyperkit driver based on user configuration
	I0731 10:50:32.484829    6097 start.go:297] selected driver: hyperkit
	I0731 10:50:32.484857    6097 start.go:901] validating driver "hyperkit" against <nil>
	I0731 10:50:32.484879    6097 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:50:32.489333    6097 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:50:32.489451    6097 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:50:32.497812    6097 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:50:32.501842    6097 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:50:32.501873    6097 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:50:32.501906    6097 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 10:50:32.502100    6097 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0731 10:50:32.502123    6097 cni.go:84] Creating CNI manager for ""
	I0731 10:50:32.502142    6097 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0731 10:50:32.502148    6097 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0731 10:50:32.502207    6097 start.go:340] cluster config:
	{Name:force-systemd-flag-854000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:force-systemd-flag-854000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:clus
ter.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:50:32.502288    6097 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:50:32.544619    6097 out.go:177] * Starting "force-systemd-flag-854000" primary control-plane node in "force-systemd-flag-854000" cluster
	I0731 10:50:32.565982    6097 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:50:32.566057    6097 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:50:32.566096    6097 cache.go:56] Caching tarball of preloaded images
	I0731 10:50:32.566322    6097 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:50:32.566340    6097 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:50:32.566479    6097 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/force-systemd-flag-854000/config.json ...
	I0731 10:50:32.566512    6097 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/force-systemd-flag-854000/config.json: {Name:mka6cfe1c8a26b47d165ed89ffbb4b6ab378785f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:50:32.567120    6097 start.go:360] acquireMachinesLock for force-systemd-flag-854000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:50:32.567249    6097 start.go:364] duration metric: took 97.818µs to acquireMachinesLock for "force-systemd-flag-854000"
	I0731 10:50:32.567289    6097 start.go:93] Provisioning new machine with config: &{Name:force-systemd-flag-854000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kuberne
tesConfig:{KubernetesVersion:v1.30.3 ClusterName:force-systemd-flag-854000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: Disable
Optimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:50:32.567376    6097 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 10:50:32.609703    6097 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0731 10:50:32.609967    6097 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:50:32.610040    6097 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:50:32.620199    6097 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54544
	I0731 10:50:32.620604    6097 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:50:32.621029    6097 main.go:141] libmachine: Using API Version  1
	I0731 10:50:32.621041    6097 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:50:32.621275    6097 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:50:32.621384    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .GetMachineName
	I0731 10:50:32.621486    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .DriverName
	I0731 10:50:32.621570    6097 start.go:159] libmachine.API.Create for "force-systemd-flag-854000" (driver="hyperkit")
	I0731 10:50:32.621599    6097 client.go:168] LocalClient.Create starting
	I0731 10:50:32.621632    6097 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 10:50:32.621685    6097 main.go:141] libmachine: Decoding PEM data...
	I0731 10:50:32.621706    6097 main.go:141] libmachine: Parsing certificate...
	I0731 10:50:32.621758    6097 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 10:50:32.621796    6097 main.go:141] libmachine: Decoding PEM data...
	I0731 10:50:32.621811    6097 main.go:141] libmachine: Parsing certificate...
	I0731 10:50:32.621823    6097 main.go:141] libmachine: Running pre-create checks...
	I0731 10:50:32.621832    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .PreCreateCheck
	I0731 10:50:32.621909    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:32.622080    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .GetConfigRaw
	I0731 10:50:32.622573    6097 main.go:141] libmachine: Creating machine...
	I0731 10:50:32.622584    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .Create
	I0731 10:50:32.622662    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:32.622786    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | I0731 10:50:32.622657    6105 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:50:32.622842    6097 main.go:141] libmachine: (force-systemd-flag-854000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 10:50:32.804821    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | I0731 10:50:32.804762    6105 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/id_rsa...
	I0731 10:50:32.857525    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | I0731 10:50:32.857449    6105 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/force-systemd-flag-854000.rawdisk...
	I0731 10:50:32.857536    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Writing magic tar header
	I0731 10:50:32.857547    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Writing SSH key tar header
	I0731 10:50:32.860640    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | I0731 10:50:32.860516    6105 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000 ...
	I0731 10:50:33.238766    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:33.238785    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/hyperkit.pid
	I0731 10:50:33.238798    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Using UUID cb0f9bd9-64c2-4c03-803d-34365c6fd8f4
	I0731 10:50:33.264173    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Generated MAC e:f7:a8:21:98:73
	I0731 10:50:33.264194    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-854000
	I0731 10:50:33.264248    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"cb0f9bd9-64c2-4c03-803d-34365c6fd8f4", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001e0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:
[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:50:33.264287    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"cb0f9bd9-64c2-4c03-803d-34365c6fd8f4", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001e0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:
[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:50:33.264339    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "cb0f9bd9-64c2-4c03-803d-34365c6fd8f4", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/force-systemd-flag-854000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/fo
rce-systemd-flag-854000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-854000"}
	I0731 10:50:33.264386    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U cb0f9bd9-64c2-4c03-803d-34365c6fd8f4 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/force-systemd-flag-854000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/bzimage,/Users/jenkins/minikube-integr
ation/19349-1046/.minikube/machines/force-systemd-flag-854000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-854000"
	I0731 10:50:33.264398    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:50:33.267600    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 DEBUG: hyperkit: Pid is 6107
	I0731 10:50:33.268791    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 0
	I0731 10:50:33.268816    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:33.268892    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:33.269860    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:33.269932    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:33.269948    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66abca9f}
	I0731 10:50:33.269958    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:33.269968    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:33.269975    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:33.269983    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:33.270000    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:33.270019    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:33.270032    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:33.270061    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:33.270077    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:33.270094    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:33.270105    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:33.270122    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:33.270131    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:33.270138    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:33.270147    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:33.270155    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:33.270162    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:33.270172    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:33.270180    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:33.270193    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:33.270227    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:33.270239    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:33.275073    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:50:33.283022    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:50:33.283849    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:50:33.283875    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:50:33.283916    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:50:33.283929    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:50:33.664098    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:50:33.664113    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:50:33.779034    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:50:33.779054    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:50:33.779065    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:50:33.779075    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:50:33.779925    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:50:33.779937    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:33 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:50:35.271781    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 1
	I0731 10:50:35.271798    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:35.271859    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:35.272704    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:35.272750    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:35.272763    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66abca9f}
	I0731 10:50:35.272782    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:35.272789    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:35.272796    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:35.272804    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:35.272863    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:35.272887    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:35.272899    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:35.272908    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:35.272915    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:35.272921    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:35.272928    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:35.272940    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:35.272948    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:35.272956    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:35.272964    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:35.272970    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:35.272977    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:35.272984    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:35.272989    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:35.272994    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:35.273000    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:35.273007    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:37.274635    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 2
	I0731 10:50:37.274653    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:37.274742    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:37.275578    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:37.275622    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:37.275636    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66abca9f}
	I0731 10:50:37.275649    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:37.275669    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:37.275678    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:37.275687    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:37.275693    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:37.275701    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:37.275708    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:37.275715    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:37.275728    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:37.275740    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:37.275757    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:37.275771    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:37.275780    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:37.275787    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:37.275794    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:37.275802    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:37.275809    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:37.275817    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:37.275842    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:37.275861    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:37.275870    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:37.275876    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:39.133082    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:39 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 10:50:39.133202    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:39 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 10:50:39.133213    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:39 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 10:50:39.153158    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:50:39 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 10:50:39.277297    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 3
	I0731 10:50:39.277320    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:39.277555    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:39.279140    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:39.279287    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:39.279303    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66abca9f}
	I0731 10:50:39.279325    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:39.279336    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:39.279345    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:39.279356    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:39.279390    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:39.279410    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:39.279422    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:39.279432    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:39.279459    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:39.279472    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:39.279482    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:39.279493    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:39.279509    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:39.279523    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:39.279533    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:39.279544    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:39.279553    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:39.279561    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:39.279571    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:39.279582    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:39.279591    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:39.279602    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:41.279461    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 4
	I0731 10:50:41.279479    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:41.279513    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:41.280437    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:41.280446    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:41.280454    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66abca9f}
	I0731 10:50:41.280460    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:41.280502    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:41.280527    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:41.280540    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:41.280553    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:41.280562    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:41.280571    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:41.280587    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:41.280595    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:41.280604    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:41.280611    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:41.280627    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:41.280636    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:41.280644    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:41.280651    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:41.280659    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:41.280666    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:41.280674    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:41.280680    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:41.280687    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:41.280693    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:41.280701    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:43.282624    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 5
	I0731 10:50:43.282641    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:43.282652    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:43.283540    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:43.283593    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:43.283608    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:50:43.283624    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:43.283635    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:43.283648    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:43.283656    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:43.283665    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:43.283674    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:43.283683    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:43.283702    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:43.283711    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:43.283717    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:43.283725    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:43.283731    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:43.283738    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:43.283762    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:43.283776    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:43.283790    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:43.283800    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:43.283814    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:43.283827    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:43.283837    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:43.283845    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:43.283865    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:45.283733    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 6
	I0731 10:50:45.283764    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:45.283788    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:45.284698    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:45.284766    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:45.284779    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:50:45.284803    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:45.284834    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:45.284856    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:45.284873    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:45.284885    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:45.284893    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:45.284902    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:45.284908    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:45.284915    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:45.284922    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:45.284929    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:45.284937    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:45.284945    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:45.284951    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:45.284957    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:45.284963    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:45.284971    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:45.284986    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:45.284999    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:45.285006    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:45.285013    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:45.285020    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:47.286884    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 7
	I0731 10:50:47.286900    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:47.286942    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:47.287764    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:47.287800    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:47.287809    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:50:47.287819    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:47.287827    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:47.287839    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:47.287847    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:47.287853    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:47.287860    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:47.287866    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:47.287885    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:47.287896    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:47.287918    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:47.287931    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:47.287938    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:47.287947    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:47.287954    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:47.287960    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:47.287966    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:47.287974    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:47.287980    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:47.287987    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:47.287993    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:47.288000    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:47.288008    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:49.289929    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 8
	I0731 10:50:49.289945    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:49.289993    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:49.290814    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:49.290862    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:49.290874    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:50:49.290883    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:49.290890    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:49.290911    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:49.290922    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:49.290929    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:49.290937    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:49.290957    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:49.290965    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:49.290974    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:49.290980    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:49.290987    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:49.290995    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:49.291002    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:49.291016    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:49.291035    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:49.291047    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:49.291055    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:49.291072    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:49.291085    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:49.291104    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:49.291121    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:49.291130    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:51.291132    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 9
	I0731 10:50:51.291150    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:51.291252    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:51.292068    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:51.292119    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:51.292130    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:50:51.292143    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:51.292150    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:51.292156    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:51.292162    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:51.292170    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:51.292177    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:51.292184    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:51.292190    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:51.292207    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:51.292241    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:51.292253    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:51.292260    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:51.292268    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:51.292279    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:51.292289    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:51.292296    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:51.292304    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:51.292318    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:51.292330    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:51.292339    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:51.292347    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:51.292356    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:53.292296    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 10
	I0731 10:50:53.292322    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:53.292435    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:53.293297    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:53.293328    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:53.293340    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:50:53.293351    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:53.293377    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:53.293387    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:53.293397    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:53.293414    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:53.293427    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:53.293443    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:53.293455    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:53.293468    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:53.293475    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:53.293490    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:53.293505    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:53.293518    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:53.293526    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:53.293534    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:53.293539    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:53.293547    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:53.293555    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:53.293562    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:53.293569    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:53.293576    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:53.293582    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:55.295505    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 11
	I0731 10:50:55.295521    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:55.295581    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:55.296501    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:55.296557    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:55.296568    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:50:55.296585    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:55.296614    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:55.296625    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:55.296634    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:55.296641    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:55.296647    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:55.296659    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:55.296672    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:55.296680    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:55.296688    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:55.296695    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:55.296703    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:55.296713    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:55.296721    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:55.296733    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:55.296742    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:55.296751    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:55.296758    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:55.296770    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:55.296782    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:55.296797    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:55.296810    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:50:57.298361    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 12
	I0731 10:50:57.298376    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:57.298470    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:57.299344    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:57.299385    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:57.299394    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:50:59.299804    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 13
	I0731 10:50:59.299819    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:50:59.299867    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:50:59.300673    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:50:59.300737    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:50:59.300747    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:50:59.300755    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:50:59.300761    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:50:59.300789    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:50:59.300798    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:50:59.300805    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:50:59.300813    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:50:59.300820    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:50:59.300830    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:50:59.300837    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:50:59.300855    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:50:59.300861    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:50:59.300869    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:50:59.300876    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:50:59.300884    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:50:59.300891    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:50:59.300898    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:50:59.300907    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:50:59.300915    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:50:59.300922    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:50:59.300930    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:50:59.300937    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:50:59.300945    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:01.302906    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 14
	I0731 10:51:01.302922    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:01.302991    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:01.303825    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:01.303860    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:03.304819    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 15
	I0731 10:51:03.304834    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:03.304910    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:03.305739    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:03.305781    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:05.306686    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 16
	I0731 10:51:05.306699    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:05.306765    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:05.307607    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:05.307620    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:07.309749    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 17
	I0731 10:51:07.309763    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:07.309806    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:07.310644    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:07.310678    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:09.312118    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 18
	I0731 10:51:09.312140    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:09.312271    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:09.313100    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:09.313144    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:09.313157    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:09.313176    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:09.313187    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:09.313195    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:09.313203    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:09.313225    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:09.313237    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:09.313246    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:09.313254    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:09.313261    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:09.313269    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:09.313282    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:09.313291    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:09.313298    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:09.313305    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:09.313321    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:09.313329    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:09.313338    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:09.313345    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:09.313359    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:09.313372    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:09.313380    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:09.313398    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:11.314176    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 19
	I0731 10:51:11.314207    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:11.314275    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:11.315159    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:11.315204    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:11.315215    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:11.315241    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:11.315249    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:11.315259    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:11.315269    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:11.315275    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:11.315282    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:11.315288    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:11.315298    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:11.315304    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:11.315311    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:11.315319    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:11.315326    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:11.315334    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:11.315341    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:11.315353    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:11.315362    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:11.315370    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:11.315381    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:11.315388    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:11.315394    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:11.315401    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:11.315408    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:13.316426    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 20
	I0731 10:51:13.316441    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:13.316543    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:13.317440    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:13.317502    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:13.317512    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:13.317521    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:13.317531    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:13.317538    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:13.317545    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:13.317558    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:13.317573    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:13.317586    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:13.317597    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:13.317604    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:13.317610    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:13.317616    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:13.317624    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:13.317632    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:13.317640    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:13.317648    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:13.317663    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:13.317673    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:13.317691    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:13.317706    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:13.317714    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:13.317723    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:13.317732    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:15.317678    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 21
	I0731 10:51:15.317694    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:15.317807    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:15.318647    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:15.318719    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:15.318737    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:15.318750    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:15.318759    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:15.318765    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:15.318772    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:15.318778    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:15.318784    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:15.318803    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:15.318817    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:15.318825    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:15.318832    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:15.318856    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:15.318867    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:15.318875    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:15.318883    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:15.318890    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:15.318899    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:15.318924    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:15.318937    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:15.318945    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:15.318954    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:15.318961    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:15.318969    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:17.320868    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 22
	I0731 10:51:17.320882    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:17.320963    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:17.321835    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:17.321891    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:17.321906    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:17.321921    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:17.321930    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:17.321937    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:17.321944    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:17.321950    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:17.321956    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:17.321963    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:17.321969    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:17.321975    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:17.321983    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:17.321990    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:17.321996    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:17.322010    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:17.322018    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:17.322026    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:17.322032    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:17.322038    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:17.322045    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:17.322051    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:17.322058    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:17.322065    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:17.322073    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:19.324052    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 23
	I0731 10:51:19.324067    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:19.324116    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:19.324963    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:19.325023    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:19.325034    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:19.325042    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:19.325063    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:19.325076    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:19.325086    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:19.325101    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:19.325115    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:19.325130    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:19.325141    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:19.325160    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:19.325169    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:19.325176    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:19.325182    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:19.325188    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:19.325195    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:19.325204    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:19.325211    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:19.325220    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:19.325227    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:19.325235    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:19.325252    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:19.325262    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:19.325274    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:21.326043    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 24
	I0731 10:51:21.326058    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:21.326160    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:21.326984    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:21.327086    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:21.327094    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:21.327101    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:21.327106    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:21.327122    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:21.327129    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:21.327137    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:21.327143    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:21.327155    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:21.327163    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:21.327170    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:21.327178    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:21.327185    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:21.327193    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:21.327199    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:21.327222    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:21.327228    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:21.327234    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:21.327253    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:21.327259    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:21.327265    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:21.327272    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:21.327277    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:21.327285    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:23.328641    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 25
	I0731 10:51:23.328656    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:23.328690    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:23.329542    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:23.329611    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:23.329621    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:23.329634    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:23.329643    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:23.329652    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:23.329661    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:23.329668    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:23.329674    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:23.329689    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:23.329707    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:23.329716    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:23.329722    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:23.329732    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:23.329744    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:23.329751    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:23.329760    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:23.329774    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:23.329787    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:23.329795    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:23.329803    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:23.329810    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:23.329819    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:23.329829    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:23.329837    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:25.329754    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 26
	I0731 10:51:25.329769    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:25.329848    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:25.330680    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:25.330721    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:25.330730    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:25.330750    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:25.330761    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:25.330770    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:25.330782    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:25.330799    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:25.330807    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:25.330813    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:25.330821    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:25.330835    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:25.330849    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:25.330858    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:25.330865    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:25.330879    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:25.330889    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:25.330896    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:25.330904    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:25.330911    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:25.330919    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:25.330926    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:25.330933    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:25.330958    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:25.330972    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:27.332376    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 27
	I0731 10:51:27.332391    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:27.332468    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:27.333336    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:27.333398    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:27.333413    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:27.333435    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:27.333444    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:27.333453    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:27.333462    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:27.333469    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:27.333476    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:27.333484    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:27.333491    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:27.333498    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:27.333516    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:27.333524    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:27.333532    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:27.333539    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:27.333551    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:27.333562    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:27.333570    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:27.333579    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:27.333587    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:27.333605    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:27.333612    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:27.333624    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:27.333631    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:29.335550    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 28
	I0731 10:51:29.335566    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:29.335653    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:29.336463    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:29.336515    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:29.336534    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:29.336561    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:29.336570    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:29.336578    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:29.336585    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:29.336592    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:29.336598    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:29.336609    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:29.336617    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:29.336624    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:29.336630    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:29.336636    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:29.336649    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:29.336657    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:29.336671    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:29.336681    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:29.336690    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:29.336700    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:29.336713    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:29.336722    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:29.336729    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:29.336737    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:29.336745    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:31.338716    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 29
	I0731 10:51:31.338733    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:31.338784    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:31.339709    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for e:f7:a8:21:98:73 in /var/db/dhcpd_leases ...
	I0731 10:51:31.339758    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 23 entries in /var/db/dhcpd_leases!
	I0731 10:51:31.339770    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:31.339792    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:31.339800    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:31.339807    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:31.339828    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:31.339844    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:31.339857    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:31.339866    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:31.339875    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:31.339884    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:31.339892    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:31.339901    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:31.339909    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:31.339917    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:31.339925    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:31.339940    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:31.339954    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:31.339962    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:31.339969    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:31.339977    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:31.339984    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:31.339992    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:31.340008    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:33.342000    6097 client.go:171] duration metric: took 1m0.721691237s to LocalClient.Create
	I0731 10:51:35.343263    6097 start.go:128] duration metric: took 1m2.777203649s to createHost
	I0731 10:51:35.343280    6097 start.go:83] releasing machines lock for "force-systemd-flag-854000", held for 1m2.777364485s
	W0731 10:51:35.343296    6097 start.go:714] error starting host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for e:f7:a8:21:98:73
	I0731 10:51:35.343710    6097 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:51:35.343736    6097 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:51:35.352315    6097 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54560
	I0731 10:51:35.352761    6097 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:51:35.353105    6097 main.go:141] libmachine: Using API Version  1
	I0731 10:51:35.353114    6097 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:51:35.353359    6097 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:51:35.353747    6097 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:51:35.353770    6097 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:51:35.362127    6097 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54562
	I0731 10:51:35.362455    6097 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:51:35.362775    6097 main.go:141] libmachine: Using API Version  1
	I0731 10:51:35.362785    6097 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:51:35.362994    6097 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:51:35.363108    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .GetState
	I0731 10:51:35.363199    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:35.363268    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:35.364251    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .DriverName
	I0731 10:51:35.427582    6097 out.go:177] * Deleting "force-systemd-flag-854000" in hyperkit ...
	I0731 10:51:35.448890    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .Remove
	I0731 10:51:35.449012    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:35.449022    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:35.449121    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:35.450093    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:35.450138    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | waiting for graceful shutdown
	I0731 10:51:36.452276    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:36.452308    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:36.453372    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | waiting for graceful shutdown
	I0731 10:51:37.453979    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:37.454053    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:37.455800    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | waiting for graceful shutdown
	I0731 10:51:38.456379    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:38.456455    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:38.457177    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | waiting for graceful shutdown
	I0731 10:51:39.458525    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:39.458577    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:39.459151    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | waiting for graceful shutdown
	I0731 10:51:40.459470    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:40.459569    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6107
	I0731 10:51:40.460670    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | sending sigkill
	I0731 10:51:40.460679    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:40.471613    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:40 WARN : hyperkit: failed to read stdout: EOF
	I0731 10:51:40.471636    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:40 WARN : hyperkit: failed to read stderr: EOF
	W0731 10:51:40.491029    6097 out.go:239] ! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for e:f7:a8:21:98:73
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for e:f7:a8:21:98:73
	I0731 10:51:40.491045    6097 start.go:729] Will try again in 5 seconds ...
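The failure above ("could not find an IP address for e:f7:a8:21:98:73") means the driver never found the new VM's generated MAC in `/var/db/dhcpd_leases`. As context for reading these log lines, here is a minimal Go sketch of that lookup. It is not the driver's actual code: the function name `findIPByMAC` is hypothetical, and the raw lease-file layout (`ip_address=`, `hw_address=1,…` fields inside `{…}` blocks) is an assumption about macOS bootpd, inferred from the parsed `dhcp entry:` lines printed above.

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// findIPByMAC scans bootpd-style lease text for hwAddr and returns the
// matching ip_address, or "" when no lease block matches.
func findIPByMAC(leases, hwAddr string) string {
	var ip, hw string
	sc := bufio.NewScanner(strings.NewReader(leases))
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		switch {
		case strings.HasPrefix(line, "ip_address="):
			ip = strings.TrimPrefix(line, "ip_address=")
		case strings.HasPrefix(line, "hw_address="):
			// field looks like "1,9e:7:8b:23:9c:e3"; drop the "1," type prefix
			hw = line[strings.Index(line, ",")+1:]
		case line == "}": // end of one lease block: compare and reset
			if hw == hwAddr {
				return ip
			}
			ip, hw = "", ""
		}
	}
	return ""
}

func main() {
	// Sample block shaped like one lease from the scans logged above.
	sample := "{\n\tname=minikube\n\tip_address=192.169.0.5\n\thw_address=1,9e:7:8b:23:9c:e3\n\tlease=0x66abc048\n}"
	fmt.Println(findIPByMAC(sample, "9e:7:8b:23:9c:e3"))
}
```

Note that bootpd prints MAC octets without leading zeros (e.g. `9e:7:8b:…` above), so any real lookup must compare normalized addresses, not the colon-padded form.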
	I0731 10:51:45.492586    6097 start.go:360] acquireMachinesLock for force-systemd-flag-854000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:51:53.414847    6097 start.go:364] duration metric: took 7.92239339s to acquireMachinesLock for "force-systemd-flag-854000"
	I0731 10:51:53.414878    6097 start.go:93] Provisioning new machine with config: &{Name:force-systemd-flag-854000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:force-systemd-flag-854000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:51:53.414955    6097 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 10:51:53.439221    6097 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0731 10:51:53.439326    6097 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:51:53.439364    6097 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:51:53.448516    6097 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54588
	I0731 10:51:53.448853    6097 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:51:53.449186    6097 main.go:141] libmachine: Using API Version  1
	I0731 10:51:53.449197    6097 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:51:53.449395    6097 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:51:53.449490    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .GetMachineName
	I0731 10:51:53.449569    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .DriverName
	I0731 10:51:53.449676    6097 start.go:159] libmachine.API.Create for "force-systemd-flag-854000" (driver="hyperkit")
	I0731 10:51:53.449692    6097 client.go:168] LocalClient.Create starting
	I0731 10:51:53.449723    6097 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 10:51:53.449764    6097 main.go:141] libmachine: Decoding PEM data...
	I0731 10:51:53.449778    6097 main.go:141] libmachine: Parsing certificate...
	I0731 10:51:53.449834    6097 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 10:51:53.449863    6097 main.go:141] libmachine: Decoding PEM data...
	I0731 10:51:53.449875    6097 main.go:141] libmachine: Parsing certificate...
	I0731 10:51:53.449894    6097 main.go:141] libmachine: Running pre-create checks...
	I0731 10:51:53.449900    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .PreCreateCheck
	I0731 10:51:53.449982    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:53.450016    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .GetConfigRaw
	I0731 10:51:53.459525    6097 main.go:141] libmachine: Creating machine...
	I0731 10:51:53.459533    6097 main.go:141] libmachine: (force-systemd-flag-854000) Calling .Create
	I0731 10:51:53.459611    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:53.459745    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | I0731 10:51:53.459609    6137 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:51:53.459798    6097 main.go:141] libmachine: (force-systemd-flag-854000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 10:51:53.685819    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | I0731 10:51:53.685751    6137 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/id_rsa...
	I0731 10:51:53.752164    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | I0731 10:51:53.752081    6137 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/force-systemd-flag-854000.rawdisk...
	I0731 10:51:53.752177    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Writing magic tar header
	I0731 10:51:53.752190    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Writing SSH key tar header
	I0731 10:51:53.752527    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | I0731 10:51:53.752496    6137 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000 ...
	I0731 10:51:54.132071    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:54.132108    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/hyperkit.pid
	I0731 10:51:54.132124    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Using UUID f68b9fd6-ce6b-4a49-9c19-780c8771a33d
	I0731 10:51:54.156517    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Generated MAC 52:54:3d:eb:40:30
	I0731 10:51:54.156537    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-854000
	I0731 10:51:54.156573    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"f68b9fd6-ce6b-4a49-9c19-780c8771a33d", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001e0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:51:54.156609    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"f68b9fd6-ce6b-4a49-9c19-780c8771a33d", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001e0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:51:54.156692    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "f68b9fd6-ce6b-4a49-9c19-780c8771a33d", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/force-systemd-flag-854000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-854000"}
	I0731 10:51:54.156734    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U f68b9fd6-ce6b-4a49-9c19-780c8771a33d -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/force-systemd-flag-854000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-flag-854000"
	I0731 10:51:54.156783    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:51:54.159863    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 DEBUG: hyperkit: Pid is 6138
	I0731 10:51:54.161164    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 0
	I0731 10:51:54.161184    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:54.161278    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:51:54.162408    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:51:54.162497    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:51:54.162542    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb32}
	I0731 10:51:54.162560    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:54.162575    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:54.162584    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:54.162594    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:54.162608    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:54.162634    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:54.162677    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:54.162712    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:54.162723    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:54.162743    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:54.162754    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:54.162776    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:54.162793    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:54.162807    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:54.162821    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:54.162832    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:54.162854    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:54.162869    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:54.162891    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:54.162899    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:54.162907    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:54.162915    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:54.162925    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:54.167749    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:51:54.175938    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-flag-854000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:51:54.177015    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:51:54.177043    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:51:54.177058    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:51:54.177071    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:51:54.556246    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:51:54.556273    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:51:54.671039    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:51:54.671060    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:51:54.671115    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:51:54.671136    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:51:54.671915    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:51:54.671927    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:51:54 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:51:56.163316    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 1
	I0731 10:51:56.163333    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:56.163395    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:51:56.164214    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:51:56.164284    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:51:56.164300    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb32}
	I0731 10:51:56.164309    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:56.164315    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:56.164322    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:56.164327    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:56.164335    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:56.164348    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:56.164356    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:56.164362    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:56.164368    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:56.164375    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:56.164383    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:56.164396    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:56.164410    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:56.164418    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:56.164427    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:56.164434    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:56.164442    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:56.164455    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:56.164463    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:56.164471    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:56.164479    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:56.164493    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:56.164507    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:51:58.165503    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 2
	I0731 10:51:58.165522    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:51:58.165673    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:51:58.166576    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:51:58.166636    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:51:58.166647    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb32}
	I0731 10:51:58.166658    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:51:58.166664    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:51:58.166671    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:51:58.166678    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:51:58.166699    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:51:58.166709    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:51:58.166716    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:51:58.166722    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:51:58.166746    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:51:58.166763    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:51:58.166771    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:51:58.166779    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:51:58.166791    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:51:58.166801    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:51:58.166808    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:51:58.166821    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:51:58.166828    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:51:58.166836    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:51:58.166843    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:51:58.166877    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:51:58.166887    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:51:58.166897    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:51:58.166905    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:00.070475    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:52:00 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 10:52:00.070606    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:52:00 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 10:52:00.070615    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:52:00 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 10:52:00.090614    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | 2024/07/31 10:52:00 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 10:52:00.166799    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 3
	I0731 10:52:00.166812    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:00.166886    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:00.167701    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:00.167735    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:00.167742    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:00.167754    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:00.167762    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:00.167768    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:00.167785    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:00.167793    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:00.167803    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:00.167826    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:00.167841    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:00.167849    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:00.167857    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:00.167871    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:00.167882    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:00.167898    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:00.167911    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:00.167919    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:00.167927    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:00.167936    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:00.167945    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:00.167951    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:00.167959    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:00.167965    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:00.167973    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:00.167987    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:02.169106    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 4
	I0731 10:52:02.169136    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:02.169201    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:02.170060    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:02.170129    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:02.170139    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:02.170165    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:02.170188    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:02.170203    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:02.170214    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:02.170223    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:02.170230    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:02.170238    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:02.170252    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:02.170262    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:02.170270    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:02.170278    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:02.170284    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:02.170291    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:02.170299    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:02.170307    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:02.170315    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:02.170337    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:02.170350    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:02.170357    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:02.170365    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:02.170372    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:02.170383    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:02.170392    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:04.171108    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 5
	I0731 10:52:04.171147    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:04.171230    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:04.172064    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:04.172127    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:04.172139    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:04.172150    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:04.172156    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:04.172170    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:04.172181    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:04.172189    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:04.172196    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:04.172203    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:04.172210    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:04.172219    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:04.172225    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:04.172232    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:04.172238    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:04.172247    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:04.172255    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:04.172262    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:04.172269    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:04.172277    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:04.172285    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:04.172292    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:04.172312    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:04.172324    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:04.172337    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:04.172346    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:06.172774    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 6
	I0731 10:52:06.172789    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:06.172856    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:06.173739    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:06.173789    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:06.173804    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:06.173834    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:06.173858    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:06.173868    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:06.173879    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:06.173888    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:06.173896    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:06.173905    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:06.173912    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:06.173919    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:06.173926    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:06.173932    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:06.173943    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:06.173952    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:06.173967    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:06.173982    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:06.173990    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:06.173998    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:06.174006    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:06.174013    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:06.174021    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:06.174036    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:06.174044    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:06.174052    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:08.175928    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 7
	I0731 10:52:08.175942    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:08.176063    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:08.176896    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:08.176949    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:08.176959    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:08.176977    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:08.176984    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:08.176991    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:08.176999    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:08.177006    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:08.177016    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:08.177025    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:08.177032    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:08.177039    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:08.177046    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:08.177054    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:08.177061    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:08.177069    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:08.177076    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:08.177083    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:08.177091    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:08.177099    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:08.177105    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:08.177113    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:08.177120    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:08.177127    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:08.177136    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:08.177143    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:10.177138    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 8
	I0731 10:52:10.177152    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:10.177187    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:10.178085    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:10.178131    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:10.178139    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:10.178150    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:10.178170    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:10.178178    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:10.178187    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:10.178197    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:10.178206    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:10.178211    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:10.178219    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:10.178227    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:10.178244    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:10.178258    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:10.178268    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:10.178280    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:10.178291    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:10.178300    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:10.178306    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:10.178315    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:10.178328    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:10.178335    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:10.178343    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:10.178353    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:10.178361    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:10.178370    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:12.178604    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 9
	I0731 10:52:12.178619    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:12.178690    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:12.179559    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:12.179613    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	... [same 24 dhcp entries as in Attempt 8 above; no entry for 52:54:3d:eb:40:30] ...
	I0731 10:52:14.180237    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 10
	I0731 10:52:14.180251    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:14.180370    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:14.181197    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:14.181270    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	... [same 24 dhcp entries as in Attempt 8 above; no entry for 52:54:3d:eb:40:30] ...
	I0731 10:52:16.182355    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 11
	I0731 10:52:16.182373    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:16.182515    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:16.183331    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:16.183373    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	... [same 24 dhcp entries as in Attempt 8 above; no entry for 52:54:3d:eb:40:30] ...
	I0731 10:52:18.185198    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 12
	I0731 10:52:18.185213    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:18.185284    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:18.186175    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:18.186201    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	... [same 24 dhcp entries as in Attempt 8 above; no entry for 52:54:3d:eb:40:30] ...
	I0731 10:52:20.187603    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 13
	I0731 10:52:20.187618    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:20.187698    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:20.188549    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:20.188598    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:20.188607    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:20.188617    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:20.188629    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:20.188638    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:20.188644    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:20.188661    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:20.188672    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:20.188681    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:20.188687    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:20.188694    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:20.188702    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:20.188710    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:20.188716    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:20.188723    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:20.188729    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:20.188738    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:20.188748    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:20.188756    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:20.188764    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:20.188772    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:20.188780    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:20.188794    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:20.188807    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:20.188816    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:22.190077    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 14
	I0731 10:52:22.190124    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:22.190188    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:22.191059    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:22.191114    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:22.191127    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:22.191155    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:22.191175    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:22.191190    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:22.191201    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:22.191209    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:22.191216    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:22.191228    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:22.191241    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:22.191249    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:22.191255    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:22.191271    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:22.191281    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:22.191293    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:22.191302    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:22.191309    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:22.191317    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:22.191331    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:22.191351    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:22.191360    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:22.191368    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:22.191375    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:22.191384    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:22.191395    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:24.191799    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 15
	I0731 10:52:24.191816    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:24.191875    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:24.192694    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:24.192757    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:24.192767    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:24.192774    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:24.192781    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:24.192806    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:24.192817    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:24.192826    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:24.192834    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:24.192841    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:24.192850    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:24.192867    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:24.192879    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:24.192886    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:24.192894    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:24.192910    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:24.192919    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:24.192928    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:24.192936    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:24.192954    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:24.192963    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:24.192971    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:24.192978    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:24.193003    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:24.193017    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:24.193028    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:26.193639    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 16
	I0731 10:52:26.193654    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:26.193797    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:26.194664    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:26.194724    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:26.194734    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:26.194762    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:26.194773    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:26.194780    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:26.194787    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:26.194794    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:26.194803    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:26.194815    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:26.194823    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:26.194831    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:26.194839    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:26.194857    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:26.194870    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:26.194879    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:26.194887    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:26.194894    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:26.194902    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:26.194909    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:26.194917    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:26.194924    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:26.194930    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:26.194935    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:26.194942    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:26.194950    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:28.195355    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 17
	I0731 10:52:28.195379    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:28.195478    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:28.196326    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:28.196348    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:28.196359    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:28.196368    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:28.196375    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:28.196383    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:28.196392    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:28.196398    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:28.196407    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:28.196425    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:28.196436    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:28.196444    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:28.196454    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:28.196462    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:28.196470    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:28.196481    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:28.196490    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:28.196497    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:28.196505    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:28.196519    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:28.196536    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:28.196544    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:28.196556    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:28.196564    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:28.196572    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:28.196580    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:30.197465    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 18
	I0731 10:52:30.197479    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:30.197557    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:30.198394    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:30.198456    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:30.198465    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:30.198473    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:30.198479    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:30.198487    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:30.198493    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:30.198506    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:30.198544    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:30.198553    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:30.198565    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:30.198574    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:30.198581    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:30.198588    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:30.198597    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:30.198605    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:30.198619    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:30.198634    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:30.198651    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:30.198664    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:30.198678    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:30.198687    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:30.198694    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:30.198701    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:30.198709    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:30.198723    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:32.200588    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 19
	I0731 10:52:32.200604    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:32.200675    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:32.201514    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:32.201563    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:32.201574    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:32.201585    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:32.201592    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:32.201599    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:32.201605    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:32.201612    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:32.201619    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:32.201626    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:32.201632    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:32.201638    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:32.201645    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:32.201655    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:32.201671    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:32.201695    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:32.201705    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:32.201713    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:32.201721    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:32.201728    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:32.201736    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:32.201743    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:32.201751    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:32.201758    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:32.201765    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:32.201773    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:34.202982    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 20
	I0731 10:52:34.203000    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:34.203111    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:34.204025    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:34.204067    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:34.204077    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:34.204086    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:34.204097    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:34.204119    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:34.204131    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:34.204139    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:34.204146    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:34.204166    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:34.204184    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:34.204194    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:34.204202    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:34.204208    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:34.204221    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:34.204232    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:34.204240    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:34.204248    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:34.204255    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:34.204263    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:34.204270    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:34.204278    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:34.204285    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:34.204292    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:34.204302    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:34.204308    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:36.206211    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 21
	I0731 10:52:36.206230    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:36.206272    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:36.207206    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:36.207272    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:36.207284    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:36.207292    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:36.207298    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:36.207307    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:36.207313    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:36.207320    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:36.207326    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:36.207333    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:36.207339    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:36.207345    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:36.207359    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:36.207368    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:36.207375    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:36.207383    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:36.207399    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:36.207408    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:36.207415    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:36.207423    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:36.207431    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:36.207438    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:36.207445    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:36.207453    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:36.207462    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:36.207472    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:38.208717    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 22
	I0731 10:52:38.208733    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:38.208792    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:38.209855    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:38.209895    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:38.209902    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:38.209909    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:38.209923    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:38.209933    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:38.209940    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:38.209949    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:38.209958    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:38.209965    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:38.209978    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:38.209985    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:38.209991    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:38.210000    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:38.210008    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:38.210024    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:38.210037    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:38.210045    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:38.210051    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:38.210060    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:38.210068    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:38.210076    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:38.210081    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:38.210100    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:38.210115    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:38.210131    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:40.210091    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 23
	I0731 10:52:40.210106    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:40.210180    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:40.211029    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:40.211084    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:40.211096    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:40.211109    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:40.211117    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:40.211124    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:40.211130    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:40.211136    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:40.211142    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:40.211149    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:40.211155    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:40.211171    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:40.211184    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:40.211194    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:40.211202    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:40.211212    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:40.211220    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:40.211228    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:40.211239    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:40.211249    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:40.211255    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:40.211263    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:40.211276    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:40.211289    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:40.211297    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:40.211305    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:42.212227    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 24
	I0731 10:52:42.212240    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:42.212303    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:42.213395    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:42.213434    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:42.213444    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:42.213452    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:42.213460    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:42.213471    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:42.213477    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:42.213496    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:42.213509    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:42.213517    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:42.213526    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:42.213533    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:42.213540    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:42.213556    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:42.213564    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:42.213572    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:42.213580    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:42.213587    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:42.213594    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:42.213602    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:42.213612    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:42.213621    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:42.213629    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:42.213636    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:42.213644    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:42.213653    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:44.214398    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 25
	I0731 10:52:44.214413    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:44.214513    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:44.215592    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:44.215649    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:44.215661    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:44.215676    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:44.215683    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:44.215691    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:44.215696    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:44.215703    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:44.215709    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:44.215716    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:44.215722    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:44.215755    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:44.215774    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:44.215782    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:44.215791    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:44.215801    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:44.215810    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:44.215817    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:44.215825    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:44.215832    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:44.215839    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:44.215846    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:44.215858    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:44.215868    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:44.215877    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:44.215886    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:46.217808    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 26
	I0731 10:52:46.217824    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:46.217965    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:46.219107    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:46.219182    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:46.219192    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:46.219200    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:46.219206    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:46.219213    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:46.219220    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:46.219227    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:46.219234    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:46.219244    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:46.219250    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:46.219270    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:46.219282    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:46.219291    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:46.219299    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:46.219306    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:46.219314    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:46.219321    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:46.219329    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:46.219336    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:46.219344    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:46.219351    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:46.219359    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:46.219368    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:46.219376    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:46.219385    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:48.219417    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 27
	I0731 10:52:48.219431    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:48.219504    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:48.220485    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:48.220564    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:48.220574    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:48.220582    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:48.220588    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:48.220608    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:48.220619    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:48.220627    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:48.220635    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:48.220651    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:48.220657    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:48.220666    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:48.220675    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:48.220685    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:48.220692    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:48.220710    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:48.220726    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:48.220735    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:48.220744    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:48.220752    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:48.220761    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:48.220768    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:48.220775    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:48.220792    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:48.220804    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:48.220814    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:50.222127    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 28
	I0731 10:52:50.222159    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:50.222240    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:50.223332    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:50.223391    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:50.223403    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:50.223420    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:50.223430    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:50.223438    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:50.223445    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:50.223451    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:50.223461    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:50.223469    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:50.223474    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:50.223495    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:50.223509    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:50.223518    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:50.223524    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:50.223530    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:50.223545    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:50.223558    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:50.223570    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:50.223578    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:50.223586    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:50.223594    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:50.223601    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:50.223608    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:50.223619    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:50.223627    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:52.224804    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Attempt 29
	I0731 10:52:52.224817    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:52:52.224893    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | hyperkit pid from json: 6138
	I0731 10:52:52.225748    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Searching for 52:54:3d:eb:40:30 in /var/db/dhcpd_leases ...
	I0731 10:52:52.225784    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:52:52.225794    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66aa79be}
	I0731 10:52:52.225804    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:52:52.225830    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:52:52.225844    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:52:52.225859    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:52:52.225871    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:52:52.225878    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:52:52.225889    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:52:52.225896    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:52:52.225906    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:52:52.225913    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:52:52.225921    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:52:52.225929    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:52:52.225943    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:52:52.225954    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:52:52.225962    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:52:52.225969    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:52:52.225977    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:52:52.225985    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:52:52.225993    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:52:52.226005    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:52:52.226013    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:52:52.226021    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:52:52.226030    6097 main.go:141] libmachine: (force-systemd-flag-854000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:52:54.225993    6097 client.go:171] duration metric: took 1m0.777559892s to LocalClient.Create
	I0731 10:52:56.228094    6097 start.go:128] duration metric: took 1m2.814440217s to createHost
	I0731 10:52:56.228106    6097 start.go:83] releasing machines lock for "force-systemd-flag-854000", held for 1m2.814553805s
	W0731 10:52:56.228177    6097 out.go:239] * Failed to start hyperkit VM. Running "minikube delete -p force-systemd-flag-854000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 52:54:3d:eb:40:30
	* Failed to start hyperkit VM. Running "minikube delete -p force-systemd-flag-854000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 52:54:3d:eb:40:30
	I0731 10:52:56.270575    6097 out.go:177] 
	W0731 10:52:56.312567    6097 out.go:239] X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 52:54:3d:eb:40:30
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 52:54:3d:eb:40:30
	W0731 10:52:56.312579    6097 out.go:239] * 
	* 
	W0731 10:52:56.313202    6097 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:52:56.375593    6097 out.go:177] 

                                                
                                                
** /stderr **
docker_test.go:93: failed to start minikube with args: "out/minikube-darwin-amd64 start -p force-systemd-flag-854000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit " : exit status 80
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-flag-854000 ssh "docker info --format {{.CgroupDriver}}"
docker_test.go:110: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p force-systemd-flag-854000 ssh "docker info --format {{.CgroupDriver}}": exit status 50 (167.624665ms)

                                                
                                                
-- stdout --
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node force-systemd-flag-854000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

                                                
                                                
** /stderr **
docker_test.go:112: failed to get docker cgroup driver. args "out/minikube-darwin-amd64 -p force-systemd-flag-854000 ssh \"docker info --format {{.CgroupDriver}}\"": exit status 50
docker_test.go:106: *** TestForceSystemdFlag FAILED at 2024-07-31 10:52:56.684189 -0700 PDT m=+4421.777140247
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p force-systemd-flag-854000 -n force-systemd-flag-854000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p force-systemd-flag-854000 -n force-systemd-flag-854000: exit status 7 (79.33914ms)

                                                
                                                
-- stdout --
	Error

                                                
                                                
-- /stdout --
** stderr ** 
	E0731 10:52:56.761542    6186 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0731 10:52:56.761565    6186 status.go:249] status error: getting IP: IP address is not set

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "force-systemd-flag-854000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:175: Cleaning up "force-systemd-flag-854000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-flag-854000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-flag-854000: (5.234822075s)
--- FAIL: TestForceSystemdFlag (149.88s)

                                                
                                    
TestForceSystemdEnv (201.73s)

                                                
                                                
=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

                                                
                                                

                                                
                                                
=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-env-141000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit 
E0731 10:53:19.405577    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:53:27.174562    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
docker_test.go:155: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p force-systemd-env-141000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit : exit status 80 (3m16.102688979s)

                                                
                                                
-- stdout --
	* [force-systemd-env-141000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=true
	* Using the hyperkit driver based on user configuration
	* Downloading driver docker-machine-driver-hyperkit:
	* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:
	
	    $ sudo chown root:wheel /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit 
	    $ sudo chmod u+s /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit 
	
	
	* Starting "force-systemd-env-141000" primary control-plane node in "force-systemd-env-141000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "force-systemd-env-141000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 10:53:15.614393    6225 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:53:15.614660    6225 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:53:15.614666    6225 out.go:304] Setting ErrFile to fd 2...
	I0731 10:53:15.614670    6225 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:53:15.614858    6225 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:53:15.616281    6225 out.go:298] Setting JSON to false
	I0731 10:53:15.639824    6225 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":4965,"bootTime":1722443430,"procs":463,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:53:15.639911    6225 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:53:15.664827    6225 out.go:177] * [force-systemd-env-141000] minikube v1.33.1 on Darwin 14.5
	I0731 10:53:15.706825    6225 notify.go:220] Checking for updates...
	I0731 10:53:15.729513    6225 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:53:15.771603    6225 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:53:15.814445    6225 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:53:15.835591    6225 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:53:15.877514    6225 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:53:15.919514    6225 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=true
	I0731 10:53:15.941031    6225 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:53:15.971452    6225 out.go:177] * Using the hyperkit driver based on user configuration
	I0731 10:53:16.013473    6225 start.go:297] selected driver: hyperkit
	I0731 10:53:16.013501    6225 start.go:901] validating driver "hyperkit" against <nil>
	I0731 10:53:16.013520    6225 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:53:16.018099    6225 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:53:18.350695    6225 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/testdata/hyperkit-driver-without-version:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	W0731 10:53:18.364107    6225 install.go:62] docker-machine-driver-hyperkit: exit status 1
	I0731 10:53:18.385768    6225 out.go:177] * Downloading driver docker-machine-driver-hyperkit:
	I0731 10:53:18.428099    6225 download.go:107] Downloading: https://github.com/kubernetes/minikube/releases/download/v1.33.1/docker-machine-driver-hyperkit-amd64?checksum=file:https://github.com/kubernetes/minikube/releases/download/v1.33.1/docker-machine-driver-hyperkit-amd64.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:53:18.832507    6225 driver.go:46] failed to download arch specific driver: getter: &{Ctx:context.Background Src:https://github.com/kubernetes/minikube/releases/download/v1.33.1/docker-machine-driver-hyperkit-amd64?checksum=file:https://github.com/kubernetes/minikube/releases/download/v1.33.1/docker-machine-driver-hyperkit-amd64.sha256 Dst:/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit.download Pwd: Mode:2 Umask:---------- Detectors:[0x63b7960 0x63b7960 0x63b7960 0x63b7960 0x63b7960 0x63b7960 0x63b7960] Decompressors:map[bz2:0xc000697650 gz:0xc000697658 tar:0xc000697600 tar.bz2:0xc000697610 tar.gz:0xc000697620 tar.xz:0xc000697630 tar.zst:0xc000697640 tbz2:0xc000697610 tgz:0xc000697620 txz:0xc000697630 tzst:0xc000697640 xz:0xc000697660 zip:0xc000697670 zst:0xc000697668] Getters:map[file:0xc0001ac200 http:0xc000710af0 https:0xc000710b40] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: invalid checksum: Error downloading checksum file: bad response code: 404. trying to get the common version
	I0731 10:53:18.832547    6225 download.go:107] Downloading: https://github.com/kubernetes/minikube/releases/download/v1.33.1/docker-machine-driver-hyperkit?checksum=file:https://github.com/kubernetes/minikube/releases/download/v1.33.1/docker-machine-driver-hyperkit.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:53:21.299112    6225 install.go:79] stdout: 
	I0731 10:53:21.336407    6225 out.go:177] * The 'hyperkit' driver requires elevated permissions. The following commands will be executed:
	
	    $ sudo chown root:wheel /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit 
	    $ sudo chmod u+s /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit 
	
	
	I0731 10:53:21.356949    6225 install.go:99] testing: [sudo -n chown root:wheel /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit]
	I0731 10:53:21.373307    6225 install.go:106] running: [sudo chown root:wheel /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit]
	I0731 10:53:21.388327    6225 install.go:99] testing: [sudo -n chmod u+s /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit]
	I0731 10:53:21.402278    6225 install.go:106] running: [sudo chmod u+s /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit]
	I0731 10:53:21.416189    6225 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 10:53:21.416437    6225 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0731 10:53:21.416485    6225 cni.go:84] Creating CNI manager for ""
	I0731 10:53:21.416501    6225 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0731 10:53:21.416508    6225 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0731 10:53:21.416582    6225 start.go:340] cluster config:
	{Name:force-systemd-env-141000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:force-systemd-env-141000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:53:21.416691    6225 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:53:21.458827    6225 out.go:177] * Starting "force-systemd-env-141000" primary control-plane node in "force-systemd-env-141000" cluster
	I0731 10:53:21.480080    6225 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:53:21.480129    6225 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:53:21.480145    6225 cache.go:56] Caching tarball of preloaded images
	I0731 10:53:21.480270    6225 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:53:21.480280    6225 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:53:21.480556    6225 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/force-systemd-env-141000/config.json ...
	I0731 10:53:21.480576    6225 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/force-systemd-env-141000/config.json: {Name:mk0173e39a04f53a3b2b07c7bb2ca92aec529ea6 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:53:21.480913    6225 start.go:360] acquireMachinesLock for force-systemd-env-141000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:53:21.480978    6225 start.go:364] duration metric: took 49.723µs to acquireMachinesLock for "force-systemd-env-141000"
	I0731 10:53:21.481006    6225 start.go:93] Provisioning new machine with config: &{Name:force-systemd-env-141000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:force-systemd-env-141000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:53:21.481050    6225 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 10:53:21.502123    6225 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0731 10:53:21.502292    6225 main.go:141] libmachine: Found binary path at /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:53:21.502328    6225 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:53:22.569734    6225 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54706
	I0731 10:53:22.570133    6225 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:53:22.570554    6225 main.go:141] libmachine: Using API Version  1
	I0731 10:53:22.570564    6225 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:53:22.570824    6225 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:53:22.570952    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .GetMachineName
	I0731 10:53:22.571079    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .DriverName
	I0731 10:53:22.571194    6225 start.go:159] libmachine.API.Create for "force-systemd-env-141000" (driver="hyperkit")
	I0731 10:53:22.571223    6225 client.go:168] LocalClient.Create starting
	I0731 10:53:22.571254    6225 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 10:53:22.571307    6225 main.go:141] libmachine: Decoding PEM data...
	I0731 10:53:22.571330    6225 main.go:141] libmachine: Parsing certificate...
	I0731 10:53:22.571391    6225 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 10:53:22.571429    6225 main.go:141] libmachine: Decoding PEM data...
	I0731 10:53:22.571441    6225 main.go:141] libmachine: Parsing certificate...
	I0731 10:53:22.571460    6225 main.go:141] libmachine: Running pre-create checks...
	I0731 10:53:22.571473    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .PreCreateCheck
	I0731 10:53:22.571575    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:22.571737    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .GetConfigRaw
	I0731 10:53:22.572218    6225 main.go:141] libmachine: Creating machine...
	I0731 10:53:22.572229    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .Create
	I0731 10:53:22.572301    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:22.572423    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | I0731 10:53:22.572300    6253 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:53:22.572505    6225 main.go:141] libmachine: (force-systemd-env-141000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 10:53:22.757860    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | I0731 10:53:22.757751    6253 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/id_rsa...
	I0731 10:53:22.876344    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | I0731 10:53:22.876245    6253 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/force-systemd-env-141000.rawdisk...
	I0731 10:53:22.876358    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Writing magic tar header
	I0731 10:53:22.876371    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Writing SSH key tar header
	I0731 10:53:22.877132    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | I0731 10:53:22.877044    6253 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000 ...
	I0731 10:53:23.254095    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:23.254115    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/hyperkit.pid
	I0731 10:53:23.254126    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Using UUID ad549f6c-62d4-4e05-874c-cc739f839104
	I0731 10:53:23.284768    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Generated MAC 42:63:10:65:61:9
	I0731 10:53:23.284784    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-141000
	I0731 10:53:23.284811    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"ad549f6c-62d4-4e05-874c-cc739f839104", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ae660)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:53:23.284843    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"ad549f6c-62d4-4e05-874c-cc739f839104", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ae660)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:53:23.284882    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "ad549f6c-62d4-4e05-874c-cc739f839104", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/force-systemd-env-141000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-141000"}
	I0731 10:53:23.284920    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U ad549f6c-62d4-4e05-874c-cc739f839104 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/force-systemd-env-141000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-141000"
	I0731 10:53:23.284935    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:53:23.288033    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 DEBUG: hyperkit: Pid is 6254
	I0731 10:53:23.289083    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 0
	I0731 10:53:23.289095    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:23.289160    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:23.290112    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:23.290208    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:23.290223    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:23.290231    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:23.290241    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:23.290266    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:23.290283    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:23.290314    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:23.290331    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:23.290347    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:23.290357    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:23.290369    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:23.290383    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:23.290396    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:23.290417    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:23.290432    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:23.290439    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:23.290446    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:23.290470    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:23.290483    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:23.290495    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:23.290521    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:23.290533    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:23.290539    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:23.290569    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:23.290581    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:23.296019    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:53:23.305738    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:53:23.306743    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:53:23.306770    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:53:23.306800    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:53:23.306826    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:53:23.683238    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:53:23.683254    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:53:23.797771    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:53:23.797800    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:53:23.797812    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:53:23.797819    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:53:23.798715    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:53:23.798725    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:53:25.291295    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 1
	I0731 10:53:25.291311    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:25.291410    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:25.292235    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:25.292302    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:25.292311    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:25.292320    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:25.292341    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:25.292349    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:25.292357    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:25.292365    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:25.292374    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:25.292381    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:25.292388    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:25.292394    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:25.292403    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:25.292419    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:25.292431    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:25.292446    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:25.292455    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:25.292463    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:25.292470    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:25.292486    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:25.292506    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:25.292521    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:25.292535    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:25.292547    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:25.292555    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:25.292564    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:27.293854    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 2
	I0731 10:53:27.293868    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:27.293934    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:27.294770    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:27.294797    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:27.294806    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:27.294815    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:27.294822    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:27.294842    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:27.294850    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:27.294858    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:27.294865    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:27.294872    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:27.294879    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:27.294887    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:27.294893    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:27.294899    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:27.294906    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:27.294925    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:27.294937    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:27.294945    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:27.294953    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:27.294961    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:27.294967    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:27.294977    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:27.294985    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:27.294991    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:27.295005    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:27.295023    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:29.190419    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:29 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:53:29.190537    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:29 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:53:29.190549    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:29 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:53:29.210347    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:53:29 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:53:29.295579    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 3
	I0731 10:53:29.295591    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:29.295685    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:29.296482    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:29.296542    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:29.296559    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:29.296583    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:29.296611    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:29.296619    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:29.296626    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:29.296635    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:29.296655    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:29.296667    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:29.296680    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:29.296693    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:29.296703    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:29.296709    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:29.296716    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:29.296723    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:29.296730    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:29.296737    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:29.296745    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:29.296751    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:29.296770    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:29.296790    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:29.296804    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:29.296818    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:29.296835    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:29.296848    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:31.298288    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 4
	I0731 10:53:31.298303    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:31.298388    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:31.299216    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:31.299260    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:31.299270    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:31.299281    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:31.299288    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:31.299310    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:31.299324    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:31.299332    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:31.299340    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:31.299352    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:31.299361    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:31.299368    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:31.299374    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:31.299390    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:31.299401    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:31.299428    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:31.299437    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:31.299445    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:31.299453    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:31.299460    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:31.299467    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:31.299473    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:31.299480    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:31.299486    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:31.299492    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:31.299504    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
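The repeated "Attempt N" blocks above show the driver polling `/var/db/dhcpd_leases` for a lease whose hardware address matches the VM's MAC (`42:63:10:65:61:9`), retrying roughly every two seconds while no entry matches. A minimal sketch of that matching logic is below; the entry format is taken from the log lines themselves, but the parsing helpers (`hwAddrRe`, `normalizeMAC`, `findLease`) are illustrative assumptions, not the actual `docker-machine-driver-hyperkit` code. Note that macOS writes MAC octets unpadded (`:9`, not `:09`), so a naive string comparison against a zero-padded MAC would never match.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// hwAddrRe extracts the HWAddress field from a dhcpd_leases-style entry.
// Field name taken from the log output above; this regex is a simplified
// assumption, not the real driver's parser.
var hwAddrRe = regexp.MustCompile(`HWAddress:([0-9a-fA-F:]+)`)

// normalizeMAC lowercases a MAC and strips leading zeros from each octet,
// matching the unpadded form macOS writes to /var/db/dhcpd_leases
// (e.g. "42:63:10:65:61:9").
func normalizeMAC(mac string) string {
	parts := strings.Split(strings.ToLower(mac), ":")
	for i, p := range parts {
		p = strings.TrimLeft(p, "0")
		if p == "" {
			p = "0"
		}
		parts[i] = p
	}
	return strings.Join(parts, ":")
}

// findLease scans lease entries for one whose HWAddress matches mac.
// It returns the matching entry and true, or "" and false if no lease
// exists yet (the case the driver hits on every attempt in this log).
func findLease(entries []string, mac string) (string, bool) {
	want := normalizeMAC(mac)
	for _, e := range entries {
		if m := hwAddrRe.FindStringSubmatch(e); m != nil && normalizeMAC(m[1]) == want {
			return e, true
		}
	}
	return "", false
}

func main() {
	entries := []string{
		"{Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca}",
		"{Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19}",
	}
	// The VM's MAC is not in the lease file yet, so the driver would retry.
	_, found := findLease(entries, "42:63:10:65:61:09")
	fmt.Println(found)
	// An unpadded entry still matches a zero-padded query after normalization.
	entry, found2 := findLease(entries, "0a:e1:ca:1e:61:19")
	fmt.Println(found2, entry)
}
```

In the real driver this lookup runs inside a timed retry loop (the two-second gaps between "Attempt" timestamps above); the sketch only covers the per-attempt matching step.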
	I0731 10:53:33.301214    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 5
	I0731 10:53:33.301230    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:33.301314    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:33.302126    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:33.302193    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:33.302204    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:33.302212    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:33.302232    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:33.302240    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:33.302246    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:33.302252    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:33.302262    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:33.302269    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:33.302276    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:33.302283    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:33.302288    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:33.302296    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:33.302302    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:33.302310    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:33.302318    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:33.302333    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:33.302346    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:33.302354    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:33.302363    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:33.302378    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:33.302391    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:33.302400    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:33.302408    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:33.302417    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:35.303293    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 6
	I0731 10:53:35.303308    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:35.303416    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:35.304236    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:35.304292    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:35.304300    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:35.304309    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:35.304315    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:35.304322    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:35.304330    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:35.304349    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:35.304376    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:35.304385    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:35.304392    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:35.304397    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:35.304403    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:35.304410    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:35.304421    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:35.304430    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:35.304437    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:35.304458    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:35.304470    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:35.304486    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:35.304499    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:35.304507    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:35.304516    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:35.304531    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:35.304539    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:35.304547    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:37.304813    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 7
	I0731 10:53:37.304840    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:37.304922    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:37.305819    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:37.305866    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:37.305877    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:37.305891    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:37.305898    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:37.305920    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:37.305932    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:37.305957    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:37.305970    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:37.305984    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:37.305992    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:37.306001    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:37.306007    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:37.306014    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:37.306023    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:37.306034    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:37.306045    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:37.306063    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:37.306077    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:37.306085    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:37.306091    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:37.306099    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:37.306111    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:37.306119    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:37.306127    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:37.306134    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:39.306925    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 8
	I0731 10:53:39.307009    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:39.307080    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:39.307887    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:39.307949    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:39.307961    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:39.307968    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:39.307978    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:39.307986    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:39.307994    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:39.308002    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:39.308043    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:39.308053    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:39.308064    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:39.308077    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:39.308086    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:39.308094    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:39.308106    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:39.308114    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:39.308126    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:39.308136    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:39.308145    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:39.308151    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:39.308159    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:39.308167    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:39.308174    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:39.308181    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:39.308195    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:39.308206    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:41.308289    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 9
	I0731 10:53:41.308318    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:41.308479    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:41.309334    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:41.309362    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:41.309376    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:41.309385    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:41.309391    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:41.309398    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:41.309404    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:41.309412    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:41.309418    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:41.309431    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:41.309443    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:41.309454    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:41.309462    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:41.309471    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:41.309489    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:41.309498    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:41.309505    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:41.309513    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:41.309520    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:41.309528    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:41.309536    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:41.309542    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:41.309548    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:41.309556    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:41.309563    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:41.309570    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:43.310151    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 10
	I0731 10:53:43.310167    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:43.310286    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:43.311186    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:43.311235    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:43.311246    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:43.311254    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:43.311261    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:43.311276    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:43.311296    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:43.311304    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:43.311311    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:43.311317    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:43.311332    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:43.311345    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:43.311354    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:43.311368    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:43.311378    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:43.311391    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:43.311402    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:43.311410    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:43.311423    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:43.311430    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:43.311438    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:43.311445    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:43.311452    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:43.311460    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:43.311466    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:43.311474    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:45.311440    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 11
	I0731 10:53:45.311453    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:45.311531    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:45.312381    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:45.312452    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:45.312463    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:45.312476    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:45.312486    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:45.312496    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:45.312504    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:45.312511    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:45.312517    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:45.312534    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:45.312546    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:45.312554    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:45.312563    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:45.312579    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:45.312591    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:45.312600    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:45.312608    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:45.312615    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:45.312623    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:45.312631    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:45.312639    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:45.312646    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:45.312651    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:45.312667    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:45.312679    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:45.312690    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:47.312901    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 12
	I0731 10:53:47.312916    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:47.312963    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:47.313789    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:47.313843    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:47.313858    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:47.313867    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:47.313873    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:47.313879    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:47.313885    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:47.313894    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:47.313901    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:47.313922    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:47.313937    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:47.313946    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:47.313954    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:47.313962    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:47.313989    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:47.314000    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:47.314007    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:47.314017    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:47.314024    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:47.314035    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:47.314042    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:47.314051    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:47.314058    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:47.314065    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:47.314072    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:47.314078    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:49.316019    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 13
	I0731 10:53:49.316036    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:49.316067    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:49.316902    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:49.316935    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:49.316950    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:49.316959    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:49.316968    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:49.316974    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:49.316997    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:49.317016    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:49.317023    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:49.317030    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:49.317038    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:49.317045    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:49.317053    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:49.317070    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:49.317092    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:49.317101    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:49.317109    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:49.317118    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:49.317126    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:49.317134    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:49.317142    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:49.317158    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:49.317170    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:49.317178    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:49.317184    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:49.317199    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:51.317094    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 14
	I0731 10:53:51.317107    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:51.317243    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:51.318159    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:51.318204    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:51.318215    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:51.318228    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:51.318238    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:51.318246    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:51.318253    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:51.318263    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:51.318269    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:51.318275    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:51.318284    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:51.318310    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:51.318323    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:51.318332    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:51.318344    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:51.318361    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:51.318383    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:51.318398    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:51.318412    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:51.318420    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:51.318428    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:51.318444    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:51.318456    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:51.318474    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:51.318486    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:51.318504    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:53.320235    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 15
	I0731 10:53:53.320248    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:53.320344    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:53.321219    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:53.321273    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:53.321280    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:53.321289    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:53.321295    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:53.321302    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:53.321311    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:53.321318    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:53.321325    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:53.321332    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:53.321338    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:53.321346    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:53.321373    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:53.321385    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:53.321395    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:53.321404    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:53.321417    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:53.321431    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:53.321448    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:53.321459    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:53.321472    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:53.321481    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:53.321491    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:53.321499    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:53.321506    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:53.321515    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:55.323417    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 16
	I0731 10:53:55.323431    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:55.323524    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:55.324552    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:55.324592    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:55.324606    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:55.324630    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:55.324640    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:55.324647    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:55.324656    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:55.324664    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:55.324671    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:55.324686    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:55.324698    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:55.324713    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:55.324721    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:55.324729    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:55.324735    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:55.324743    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:55.324751    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:55.324766    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:55.324774    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:55.324786    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:55.324796    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:55.324803    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:55.324811    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:55.324817    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:55.324825    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:55.324838    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:57.326702    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 17
	I0731 10:53:57.326717    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:57.326850    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:57.327719    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:57.327761    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:57.327772    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:57.327780    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:57.327786    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:57.327792    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:57.327805    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:57.327818    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:57.327826    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:57.327832    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:57.327838    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:57.327854    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:57.327861    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:57.327868    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:57.327876    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:57.327886    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:57.327895    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:57.327902    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:57.327908    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:57.327914    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:57.327922    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:57.327929    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:57.327940    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:57.327949    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:57.327956    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:57.327962    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:53:59.328198    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 18
	I0731 10:53:59.328212    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:53:59.328292    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:53:59.329082    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:53:59.329138    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:53:59.329153    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:53:59.329160    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:53:59.329166    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:53:59.329174    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:53:59.329201    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:53:59.329216    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:53:59.329240    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:53:59.329249    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:53:59.329257    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:53:59.329272    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:53:59.329284    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:53:59.329296    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:53:59.329306    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:53:59.329312    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:53:59.329319    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:53:59.329327    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:53:59.329339    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:53:59.329345    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:53:59.329351    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:53:59.329358    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:53:59.329364    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:53:59.329369    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:53:59.329375    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:53:59.329382    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:01.330901    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 19
	I0731 10:54:01.330918    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:01.331057    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:01.331931    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:01.331978    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:01.331991    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:01.332005    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:01.332019    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:01.332028    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:01.332035    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:01.332041    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:01.332048    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:01.332057    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:01.332063    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:01.332069    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:01.332085    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:01.332092    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:01.332100    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:01.332108    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:01.332118    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:01.332127    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:01.332133    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:01.332139    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:01.332145    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:01.332154    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:01.332160    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:01.332166    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:01.332181    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:01.332189    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:03.334162    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 20
	I0731 10:54:03.334177    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:03.334213    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:03.335144    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:03.335196    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:03.335206    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:03.335223    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:03.335233    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:03.335249    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:03.335261    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:03.335269    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:03.335277    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:03.335284    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:03.335291    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:03.335298    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:03.335305    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:03.335321    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:03.335329    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:03.335337    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:03.335343    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:03.335350    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:03.335358    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:03.335365    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:03.335372    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:03.335384    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:03.335395    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:03.335409    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:03.335418    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:03.335426    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:05.335657    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 21
	I0731 10:54:05.335670    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:05.335809    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:05.336640    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:05.336697    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:05.336705    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:05.336714    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:05.336720    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:05.336733    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:05.336744    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:05.336751    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:05.336758    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:05.336764    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:05.336772    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:05.336780    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:05.336786    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:05.336793    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:05.336799    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:05.336815    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:05.336826    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:05.336844    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:05.336852    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:05.336859    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:05.336866    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:05.336874    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:05.336889    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:05.336909    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:05.336922    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:05.336928    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
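The repeated "Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases" attempts above show the hyperkit driver polling the lease file for the VM's MAC address; each attempt parses every `{Name:… IPAddress:… HWAddress:…}` entry and compares hardware addresses (note the single-digit octets in the logged MACs, e.g. `9e:7:8b:…`, which require octet normalization before comparing). Below is a minimal, hypothetical sketch of such a scan — the `findIPByMAC` helper and the regex are illustrative, not minikube's actual implementation:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// leaseRe matches the one-line entry form seen in the log, e.g.
// {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:... Lease:...}
var leaseRe = regexp.MustCompile(`\{Name:(\S+) IPAddress:(\S+) HWAddress:(\S+)`)

// normalizeMAC zero-pads each octet so "42:63:10:65:61:9"
// compares equal to "42:63:10:65:61:09".
func normalizeMAC(mac string) string {
	parts := strings.Split(strings.ToLower(mac), ":")
	for i, p := range parts {
		if len(p) == 1 {
			parts[i] = "0" + p
		}
	}
	return strings.Join(parts, ":")
}

// findIPByMAC scans lease-file lines for the given hardware address
// and returns the matching IP, if any.
func findIPByMAC(lines []string, mac string) (string, bool) {
	want := normalizeMAC(mac)
	for _, l := range lines {
		m := leaseRe.FindStringSubmatch(l)
		if m == nil {
			continue
		}
		if normalizeMAC(m[3]) == want {
			return m[2], true
		}
	}
	return "", false
}

func main() {
	leases := []string{
		"{Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}",
		"{Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}",
	}
	ip, ok := findIPByMAC(leases, "9e:07:8b:23:9c:e3")
	fmt.Println(ip, ok) // the padded query matches the unpadded lease entry
	_, ok = findIPByMAC(leases, "42:63:10:65:61:9")
	fmt.Println(ok) // absent MAC: the driver would sleep and retry, as in the log
}
```

In the failing run above, the target MAC never appears among the 24 entries, so the driver keeps retrying every two seconds until it gives up — consistent with the VM never obtaining a lease.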
	I0731 10:54:07.336998    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 22
	I0731 10:54:07.337017    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:07.337074    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:07.337886    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:07.337949    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:07.337959    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:07.337969    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:07.337976    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:07.337983    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:07.337992    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:07.338005    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:07.338015    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:07.338024    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:07.338032    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:07.338039    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:07.338046    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:07.338052    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:07.338059    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:07.338066    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:07.338074    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:07.338081    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:07.338086    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:07.338101    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:07.338115    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:07.338123    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:07.338131    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:07.338145    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:07.338153    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:07.338167    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:09.340152    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 23
	I0731 10:54:09.340165    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:09.340238    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:09.341050    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:09.341093    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:09.341102    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:09.341111    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:09.341119    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:09.341133    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:09.341150    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:09.341158    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:09.341164    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:09.341172    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:09.341177    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:09.341189    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:09.341197    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:09.341204    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:09.341210    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:09.341223    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:09.341236    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:09.341254    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:09.341263    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:09.341272    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:09.341280    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:09.341286    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:09.341295    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:09.341301    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:09.341309    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:09.341318    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:11.343264    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 24
	I0731 10:54:11.343279    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:11.343332    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:11.344225    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:11.344255    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:11.344267    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:11.344285    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:11.344297    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:11.344305    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:11.344311    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:11.344318    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:11.344326    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:11.344340    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:11.344352    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:11.344365    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:11.344376    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:11.344383    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:11.344393    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:11.344401    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:11.344409    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:11.344417    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:11.344423    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:11.344435    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:11.344452    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:11.344466    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:11.344480    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:11.344488    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:11.344496    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:11.344505    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:13.346379    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 25
	I0731 10:54:13.346396    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:13.346426    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:13.347240    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:13.347285    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:13.347299    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:13.347314    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:13.347335    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:13.347347    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:13.347357    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:13.347366    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:13.347385    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:13.347394    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:13.347402    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:13.347409    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:13.347416    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:13.347423    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:13.347436    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:13.347446    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:13.347460    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:13.347471    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:13.347478    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:13.347485    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:13.347506    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:13.347517    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:13.347526    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:13.347532    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:13.347538    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:13.347543    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:15.349449    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 26
	I0731 10:54:15.349465    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:15.349505    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:15.350364    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:15.350387    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:15.350400    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:15.350430    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:15.350440    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:15.350447    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:15.350459    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:15.350466    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:15.350472    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:15.350478    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:15.350485    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:15.350490    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:15.350501    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:15.350511    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:15.350519    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:15.350527    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:15.350535    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:15.350545    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:15.350551    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:15.350556    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:15.350565    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:15.350575    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:15.350584    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:15.350590    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:15.350605    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:15.350618    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:17.351696    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 27
	I0731 10:54:17.351713    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:17.351779    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:17.352831    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:17.352875    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:17.352885    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:17.352895    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:17.352907    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:17.352914    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:17.352920    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:17.352933    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:17.352958    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:17.352966    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:17.352973    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:17.352981    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:17.352992    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:17.353001    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:17.353012    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:17.353020    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:17.353027    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:17.353034    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:17.353041    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:17.353049    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:17.353055    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:17.353062    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:17.353069    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:17.353075    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:17.353082    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:17.353090    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:19.354081    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 28
	I0731 10:54:19.354096    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:19.354161    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:19.354967    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:19.355018    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:19.355033    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:19.355053    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:19.355064    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:19.355073    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:19.355083    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:19.355091    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:19.355097    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:19.355104    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:19.355111    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:19.355117    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:19.355124    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:19.355132    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:19.355139    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:19.355154    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:19.355164    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:19.355174    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:19.355185    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:19.355194    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:19.355207    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:19.355216    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:19.355224    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:19.355231    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:19.355247    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:19.355262    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:21.355305    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 29
	I0731 10:54:21.355320    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:21.355371    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:21.356248    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for 42:63:10:65:61:9 in /var/db/dhcpd_leases ...
	I0731 10:54:21.356312    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:54:21.356323    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:54:21.356333    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:54:21.356339    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:54:21.356346    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:54:21.356352    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:54:21.356372    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:54:21.356390    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:54:21.356398    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:54:21.356406    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:54:21.356414    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:54:21.356421    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:54:21.356428    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:54:21.356434    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:54:21.356440    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:54:21.356446    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:54:21.356455    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:54:21.356463    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:54:21.356471    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:54:21.356478    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:54:21.356486    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:54:21.356505    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:54:21.356516    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:54:21.356528    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:54:21.356536    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:54:23.358514    6225 client.go:171] duration metric: took 1m0.788548805s to LocalClient.Create
	I0731 10:54:25.360613    6225 start.go:128] duration metric: took 1m3.880854189s to createHost
	I0731 10:54:25.360644    6225 start.go:83] releasing machines lock for "force-systemd-env-141000", held for 1m3.880991031s
	W0731 10:54:25.360696    6225 start.go:714] error starting host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 42:63:10:65:61:9
	I0731 10:54:25.360998    6225 main.go:141] libmachine: Found binary path at /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:54:25.361023    6225 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:54:25.370091    6225 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54717
	I0731 10:54:25.370469    6225 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:54:25.370934    6225 main.go:141] libmachine: Using API Version  1
	I0731 10:54:25.370948    6225 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:54:25.371189    6225 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:54:25.371593    6225 main.go:141] libmachine: Found binary path at /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:54:25.371637    6225 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:54:25.380268    6225 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54719
	I0731 10:54:25.380720    6225 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:54:25.381160    6225 main.go:141] libmachine: Using API Version  1
	I0731 10:54:25.381175    6225 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:54:25.381415    6225 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:54:25.381548    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .GetState
	I0731 10:54:25.381656    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:25.381716    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:25.382717    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .DriverName
	I0731 10:54:25.403992    6225 out.go:177] * Deleting "force-systemd-env-141000" in hyperkit ...
	I0731 10:54:25.445993    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .Remove
	I0731 10:54:25.446136    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:25.446146    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:25.446215    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:25.447172    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:25.447240    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | waiting for graceful shutdown
	I0731 10:54:26.449352    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:26.449474    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:26.450454    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | waiting for graceful shutdown
	I0731 10:54:27.450938    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:27.451003    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:27.452635    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | waiting for graceful shutdown
	I0731 10:54:28.454643    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:28.454719    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:28.455317    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | waiting for graceful shutdown
	I0731 10:54:29.456111    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:29.456166    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:29.456807    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | waiting for graceful shutdown
	I0731 10:54:30.457513    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:54:30.457625    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6254
	I0731 10:54:30.458648    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | sending sigkill
	I0731 10:54:30.458660    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	W0731 10:54:30.469195    6225 out.go:239] ! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 42:63:10:65:61:9
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 42:63:10:65:61:9
	I0731 10:54:30.469213    6225 start.go:729] Will try again in 5 seconds ...
	I0731 10:54:30.477595    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:54:30 WARN : hyperkit: failed to read stderr: EOF
	I0731 10:54:30.477619    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:54:30 WARN : hyperkit: failed to read stdout: EOF
	I0731 10:54:35.470882    6225 start.go:360] acquireMachinesLock for force-systemd-env-141000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:55:28.322121    6225 start.go:364] duration metric: took 52.852313196s to acquireMachinesLock for "force-systemd-env-141000"
	I0731 10:55:28.322153    6225 start.go:93] Provisioning new machine with config: &{Name:force-systemd-env-141000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2048 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kubernet
esConfig:{KubernetesVersion:v1.30.3 ClusterName:force-systemd-env-141000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOp
timizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:55:28.322201    6225 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 10:55:28.343800    6225 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0731 10:55:28.343877    6225 main.go:141] libmachine: Found binary path at /Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit
	I0731 10:55:28.343903    6225 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:55:28.352334    6225 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54727
	I0731 10:55:28.352704    6225 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:55:28.353020    6225 main.go:141] libmachine: Using API Version  1
	I0731 10:55:28.353029    6225 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:55:28.353305    6225 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:55:28.353444    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .GetMachineName
	I0731 10:55:28.353530    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .DriverName
	I0731 10:55:28.353640    6225 start.go:159] libmachine.API.Create for "force-systemd-env-141000" (driver="hyperkit")
	I0731 10:55:28.353675    6225 client.go:168] LocalClient.Create starting
	I0731 10:55:28.353702    6225 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 10:55:28.353756    6225 main.go:141] libmachine: Decoding PEM data...
	I0731 10:55:28.353768    6225 main.go:141] libmachine: Parsing certificate...
	I0731 10:55:28.353813    6225 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 10:55:28.353851    6225 main.go:141] libmachine: Decoding PEM data...
	I0731 10:55:28.353862    6225 main.go:141] libmachine: Parsing certificate...
	I0731 10:55:28.353874    6225 main.go:141] libmachine: Running pre-create checks...
	I0731 10:55:28.353880    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .PreCreateCheck
	I0731 10:55:28.353961    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:28.354030    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .GetConfigRaw
	I0731 10:55:28.364835    6225 main.go:141] libmachine: Creating machine...
	I0731 10:55:28.364846    6225 main.go:141] libmachine: (force-systemd-env-141000) Calling .Create
	I0731 10:55:28.364933    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:28.365043    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | I0731 10:55:28.364919    6289 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:55:28.365097    6225 main.go:141] libmachine: (force-systemd-env-141000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 10:55:28.591016    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | I0731 10:55:28.590918    6289 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/id_rsa...
	I0731 10:55:28.795461    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | I0731 10:55:28.795378    6289 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/force-systemd-env-141000.rawdisk...
	I0731 10:55:28.795472    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Writing magic tar header
	I0731 10:55:28.795481    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Writing SSH key tar header
	I0731 10:55:28.796055    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | I0731 10:55:28.796018    6289 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000 ...
	I0731 10:55:29.171176    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:29.171195    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/hyperkit.pid
	I0731 10:55:29.171265    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Using UUID 02bf5e2a-fc2c-4893-bcde-c1713f8cf064
	I0731 10:55:29.309465    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Generated MAC e:1c:a3:1b:1b:53
	I0731 10:55:29.309480    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-141000
	I0731 10:55:29.309527    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"02bf5e2a-fc2c-4893-bcde-c1713f8cf064", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ce270)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]str
ing(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:55:29.309556    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"02bf5e2a-fc2c-4893-bcde-c1713f8cf064", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001ce270)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]str
ing(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:55:29.309606    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "02bf5e2a-fc2c-4893-bcde-c1713f8cf064", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/force-systemd-env-141000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-sys
temd-env-141000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-141000"}
	I0731 10:55:29.309641    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 02bf5e2a-fc2c-4893-bcde-c1713f8cf064 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/force-systemd-env-141000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/bzimage,/Users/jenkins/minikube-integration/19
349-1046/.minikube/machines/force-systemd-env-141000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-141000"
	I0731 10:55:29.309650    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:55:29.312633    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 DEBUG: hyperkit: Pid is 6290
	I0731 10:55:29.313133    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 0
	I0731 10:55:29.313149    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:29.313214    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:29.314166    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:29.314228    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:29.314244    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:29.314259    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:29.314269    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:29.314283    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:29.314324    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:29.314348    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:29.314360    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:29.314375    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:29.314386    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:29.314394    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:29.314402    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:29.314409    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:29.314483    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:29.314516    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:29.314533    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:29.314545    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:29.314562    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:29.314578    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:29.314592    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:29.314609    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:29.314624    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:29.314635    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:29.314661    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:29.314675    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:29.320130    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:55:29.328164    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/force-systemd-env-141000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:55:29.329082    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:55:29.329096    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:55:29.329103    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:55:29.329109    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:55:29.704800    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:55:29.704817    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:55:29.819328    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:55:29.819348    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:55:29.819386    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:55:29.819405    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:55:29.820215    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:55:29.820225    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:29 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:55:31.316253    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 1
	I0731 10:55:31.316273    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:31.316373    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:31.317192    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:31.317256    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:31.317276    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:31.317294    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:31.317301    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:31.317310    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:31.317321    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:31.317344    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:31.317351    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:31.317357    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:31.317364    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:31.317371    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:31.317377    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:31.317383    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:31.317389    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:31.317397    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:31.317404    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:31.317411    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:31.317419    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:31.317426    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:31.317431    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:31.317438    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:31.317445    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:31.317451    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:31.317459    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:31.317467    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:33.318704    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 2
	I0731 10:55:33.318719    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:33.318805    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:33.319715    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:33.319768    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:33.319784    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:33.319796    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:33.319803    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:33.319813    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:33.319829    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:33.319836    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:33.319844    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:33.319852    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:33.319857    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:33.319864    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:33.319871    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:33.319878    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:33.319886    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:33.319895    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:33.319901    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:33.319908    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:33.319914    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:33.319926    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:33.319932    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:33.319938    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:33.319946    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:33.319954    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:33.319960    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:33.319968    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:35.190317    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:35 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 10:55:35.190441    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:35 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 10:55:35.190450    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:35 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 10:55:35.210205    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | 2024/07/31 10:55:35 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 10:55:35.321309    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 3
	I0731 10:55:35.321345    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:35.321537    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:35.323059    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:35.323197    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:35.323221    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:35.323266    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:35.323287    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:35.323310    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:35.323323    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:35.323333    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:35.323346    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:35.323355    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:35.323373    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:35.323401    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:35.323420    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:35.323431    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:35.323440    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:35.323449    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:35.323458    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:35.323477    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:35.323499    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:35.323511    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:35.323523    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:35.323540    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:35.323555    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:35.323576    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:35.323593    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:35.323607    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:37.324564    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 4
	I0731 10:55:37.324578    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:37.324651    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:37.325466    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:37.325551    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:37.325565    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:37.325577    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:37.325587    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:37.325595    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:37.325620    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:37.325630    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:37.325642    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:37.325652    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:37.325660    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:37.325666    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:37.325677    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:37.325685    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:37.325693    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:37.325701    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:37.325710    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:37.325716    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:37.325723    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:37.325731    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:37.325742    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:37.325753    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:37.325768    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:37.325780    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:37.325789    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:37.325796    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:39.326473    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 5
	I0731 10:55:39.326494    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:39.326541    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:39.327359    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:39.327422    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:39.327431    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:39.327439    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:39.327456    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:39.327465    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:39.327471    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:39.327479    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:39.327486    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:39.327492    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:39.327499    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:39.327505    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:39.327512    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:39.327530    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:39.327542    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:39.327557    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:39.327570    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:39.327578    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:39.327586    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:39.327592    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:39.327608    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:39.327618    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:39.327627    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:39.327635    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:39.327646    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:39.327655    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:41.329229    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 6
	I0731 10:55:41.329246    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:41.329379    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:41.330241    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:41.330271    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:41.330282    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:41.330296    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:41.330306    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:41.330321    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:41.330337    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:41.330353    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:41.330366    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:41.330375    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:41.330381    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:41.330400    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:41.330410    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:41.330417    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:41.330426    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:41.330433    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:41.330441    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:41.330447    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:41.330454    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:41.330461    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:41.330469    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:41.330486    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:41.330497    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:41.330508    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:41.330516    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:41.330525    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:43.332222    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 7
	I0731 10:55:43.332238    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:43.332328    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:43.333334    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:43.333389    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:43.333400    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:43.333416    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:43.333440    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:43.333447    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:43.333453    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:43.333459    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:43.333467    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:43.333474    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:43.333483    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:43.333491    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:43.333500    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:43.333508    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:43.333516    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:43.333522    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:43.333531    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:43.333538    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:43.333544    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:43.333551    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:43.333557    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:43.333564    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:43.333570    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:43.333575    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:43.333580    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:43.333588    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:45.335260    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 8
	I0731 10:55:45.335276    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:45.335337    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:45.336242    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:45.336285    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:45.336294    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:45.336304    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:45.336323    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:45.336335    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:45.336342    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:45.336348    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:45.336356    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:45.336373    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:45.336387    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:45.336404    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:45.336418    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:45.336427    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:45.336432    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:45.336448    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:45.336467    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:45.336476    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:45.336484    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:45.336492    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:45.336500    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:45.336508    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:45.336515    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:45.336522    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:45.336528    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:45.336543    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:47.337576    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 9
	I0731 10:55:47.337588    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:47.337725    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:47.338681    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:47.338735    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:47.338747    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:47.338757    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:47.338769    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:47.338778    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:47.338783    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:47.338796    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:47.338802    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:47.338812    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:47.338831    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:47.338840    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:47.338848    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:47.338865    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:47.338887    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:47.338895    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:47.338917    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:47.338929    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:47.338937    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:47.338970    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:47.338990    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:47.339001    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:47.339009    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:47.339016    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:47.339023    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:47.339045    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:49.340938    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 10
	I0731 10:55:49.340951    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:49.341024    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:49.341882    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:49.341925    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:49.341937    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:49.341965    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:49.341980    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:49.341989    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:49.342003    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:49.342012    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:49.342021    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:49.342031    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:49.342039    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:49.342047    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:49.342065    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:49.342078    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:49.342087    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:49.342094    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:49.342110    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:49.342122    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:49.342131    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:49.342139    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:49.342146    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:49.342154    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:49.342169    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:49.342177    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:49.342186    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:49.342194    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:51.342527    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 11
	I0731 10:55:51.342541    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:51.342652    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:51.343474    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:51.343524    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:51.343534    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:51.343549    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:51.343559    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:51.343566    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:51.343573    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:51.343579    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:51.343587    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:51.343600    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:51.343610    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:51.343618    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:51.343625    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:51.343632    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:51.343638    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:51.343653    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:51.343679    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:51.343688    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:51.343695    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:51.343704    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:51.343711    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:51.343717    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:51.343722    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:51.343729    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:51.343735    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:51.343741    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:53.343662    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 12
	I0731 10:55:53.343679    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:53.343800    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:53.344812    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:53.344858    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:53.344869    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:53.344878    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:53.344886    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:53.344893    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:53.344901    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:53.344908    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:53.344920    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:53.344930    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:53.344936    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:53.344945    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:53.344951    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:53.344966    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:53.344975    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:53.344989    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:53.344996    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:53.345012    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:53.345018    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:53.345026    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:53.345039    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:53.345054    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:53.345067    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:53.345076    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:53.345084    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:53.345093    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:55.345256    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 13
	I0731 10:55:55.345273    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:55.345380    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:55.346216    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:55.346236    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:55.346266    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:55.346282    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:55.346291    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:55.346298    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:55.346308    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:55.346317    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:55.346323    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:55.346329    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:55.346337    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:55.346344    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:55.346350    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:55.346371    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:55.346383    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:55.346391    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:55.346398    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:55.346407    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:55.346421    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:55.346433    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:55.346440    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:55.346449    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:55.346458    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:55.346466    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:55.346474    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:55.346481    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:57.348401    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 14
	I0731 10:55:57.348415    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:57.348531    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:57.349371    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:57.349427    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:57.349442    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:57.349453    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:57.349461    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:57.349476    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:57.349484    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:57.349494    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:57.349507    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:57.349516    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:57.349534    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:57.349544    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:57.349566    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:57.349582    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:57.349601    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:57.349618    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:57.349631    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:57.349651    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:57.349662    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:57.349670    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:57.349679    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:57.349685    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:57.349693    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:57.349700    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:57.349707    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:57.349715    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:55:59.350197    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 15
	I0731 10:55:59.350215    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:55:59.350339    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:55:59.351182    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:55:59.351236    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:55:59.351248    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:55:59.351257    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:55:59.351263    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:55:59.351270    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:55:59.351279    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:55:59.351285    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:55:59.351293    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:55:59.351300    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:55:59.351309    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:55:59.351321    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:55:59.351332    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:55:59.351349    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:55:59.351363    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:55:59.351372    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:55:59.351381    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:55:59.351389    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:55:59.351397    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:55:59.351404    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:55:59.351411    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:55:59.351418    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:55:59.351426    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:55:59.351432    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:55:59.351438    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:55:59.351454    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:01.353419    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 16
	I0731 10:56:01.353434    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:01.353502    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:01.354350    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:01.354405    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:01.354418    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:01.354427    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:01.354433    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:01.354439    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:01.354446    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:01.354453    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:01.354460    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:01.354468    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:01.354475    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:01.354483    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:01.354501    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:01.354514    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:01.354522    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:01.354528    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:01.354545    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:01.354555    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:01.354565    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:01.354572    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:01.354578    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:01.354586    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:01.354593    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:01.354608    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:01.354616    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:01.354631    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:03.356527    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 17
	I0731 10:56:03.356541    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:03.356640    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:03.357493    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:03.357540    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:03.357550    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:03.357570    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:03.357579    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:03.357592    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:03.357604    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:03.357638    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:03.357647    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:03.357654    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:03.357663    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:03.357671    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:03.357679    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:03.357703    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:03.357717    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:03.357740    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:03.357754    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:03.357762    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:03.357772    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:03.357779    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:03.357788    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:03.357803    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:03.357819    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:03.357831    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:03.357838    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:03.357847    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:05.358624    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 18
	I0731 10:56:05.358665    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:05.358732    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:05.359685    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:05.359744    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:05.359753    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:05.359765    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:05.359776    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:05.359795    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:05.359802    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:05.359817    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:05.359824    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:05.359831    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:05.359837    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:05.359845    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:05.359856    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:05.359863    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:05.359868    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:05.359882    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:05.359908    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:05.359923    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:05.359937    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:05.359945    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:05.359958    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:05.359966    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:05.359974    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:05.359981    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:05.359989    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:05.360003    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:07.360747    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 19
	I0731 10:56:07.360762    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:07.360827    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:07.361699    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:07.361773    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:07.361787    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:07.361797    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:07.361823    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:07.361831    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:07.361841    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:07.361847    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:07.361875    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:07.361895    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:07.361908    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:07.361919    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:07.361936    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:07.361948    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:07.361957    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:07.361965    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:07.361977    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:07.361991    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:07.362001    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:07.362009    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:07.362016    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:07.362035    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:07.362042    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:07.362049    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:07.362056    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:07.362064    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:09.363930    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 20
	I0731 10:56:09.363988    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:09.364072    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:09.364875    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:09.364920    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:09.364929    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:09.364950    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:09.364965    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:09.364971    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:09.364979    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:09.364986    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:09.364993    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:09.365003    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:09.365009    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:09.365017    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:09.365035    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:09.365052    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:09.365066    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:09.365079    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:09.365089    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:09.365097    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:09.365104    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:09.365112    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:09.365135    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:09.365146    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:09.365155    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:09.365161    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:09.365171    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:09.365180    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:11.365560    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 21
	I0731 10:56:11.365576    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:11.365648    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:11.366591    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:11.366639    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:11.366653    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:11.366665    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:11.366673    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:11.366680    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:11.366686    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:11.366699    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:11.366706    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:11.366712    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:11.366725    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:11.366733    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:11.366741    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:11.366748    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:11.366758    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:11.366766    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:11.366776    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:11.366785    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:11.366794    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:11.366800    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:11.366807    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:11.366814    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:11.366821    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:11.366829    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:11.366836    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:11.366841    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:13.368276    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 22
	I0731 10:56:13.368291    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:13.368370    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:13.369233    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:13.369279    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:13.369287    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:13.369306    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:13.369316    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:13.369324    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:13.369331    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:13.369338    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:13.369344    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:13.369356    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:13.369370    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:13.369388    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:13.369398    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:13.369406    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:13.369415    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:13.369422    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:13.369432    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:13.369440    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:13.369447    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:13.369454    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:13.369461    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:13.369475    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:13.369489    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:13.369496    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:13.369502    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:13.369509    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:15.369470    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 23
	I0731 10:56:15.369485    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:15.369557    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:15.370361    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:15.370419    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:15.370431    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:15.370440    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:15.370446    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:15.370457    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:15.370468    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:15.370489    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:15.370497    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:15.370507    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:15.370517    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:15.370532    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:15.370544    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:15.370552    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:15.370560    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:15.370567    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:15.370585    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:15.370609    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:15.370621    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:15.370628    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:15.370635    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:15.370641    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:15.370648    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:15.370655    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:15.370661    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:15.370685    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:17.370807    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 24
	I0731 10:56:17.370822    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:17.370887    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:17.371721    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:17.371772    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:17.371781    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:17.371789    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:17.371798    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:17.371813    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:17.371823    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:17.371831    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:17.371838    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:17.371845    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:17.371853    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:17.371859    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:17.371867    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:17.371876    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:17.371895    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:17.371906    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:17.371918    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:17.371938    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:17.371951    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:17.371959    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:17.371967    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:17.371983    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:17.372011    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:17.372052    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:17.372060    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:17.372071    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:19.372339    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 25
	I0731 10:56:19.372366    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:19.372434    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:19.373257    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:19.373316    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:19.373325    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:19.373349    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:19.373364    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:19.373373    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:19.373379    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:19.373388    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:19.373396    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:19.373402    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:19.373409    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:19.373415    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:19.373423    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:19.373438    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:19.373446    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:19.373453    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:19.373460    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:19.373467    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:19.373475    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:19.373490    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:19.373502    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:19.373511    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:19.373519    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:19.373531    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:19.373540    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:19.373549    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:21.375461    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 26
	I0731 10:56:21.375476    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:21.375588    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:21.376571    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:21.376622    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:21.376641    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:21.376660    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:21.376682    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:21.376692    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:21.376700    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:21.376709    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:21.376715    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:21.376722    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:21.376728    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:21.376735    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:21.376741    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:21.376756    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:21.376767    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:21.376781    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:21.376792    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:21.376807    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:21.376816    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:21.376823    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:21.376832    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:21.376839    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:21.376853    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:21.376860    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:21.376866    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:21.376873    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:23.378779    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 27
	I0731 10:56:23.378794    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:23.378876    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:23.379697    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:23.379749    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:23.379771    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:23.379786    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:23.379793    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:23.379806    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:23.379813    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:23.379821    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:23.379827    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:23.379836    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:23.379844    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:23.379851    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:23.379859    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:23.379866    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:23.379872    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:23.379885    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:23.379898    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:23.379914    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:23.379922    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:23.379931    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:23.379939    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:23.379947    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:23.379954    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:23.379965    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:23.379973    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:23.379991    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:25.379918    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 28
	I0731 10:56:25.379934    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:25.379945    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:25.380899    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:25.380941    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:25.380954    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:25.380982    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:25.380994    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:25.381002    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:25.381008    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:25.381015    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:25.381022    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:25.381041    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:25.381066    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:25.381081    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:25.381089    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:25.381096    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:25.381104    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:25.381112    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:25.381120    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:25.381128    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:25.381144    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:25.381152    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:25.381158    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:25.381164    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:25.381187    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:25.381196    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:25.381201    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:25.381217    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:27.383188    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Attempt 29
	I0731 10:56:27.383207    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | exe=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0731 10:56:27.383259    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | hyperkit pid from json: 6290
	I0731 10:56:27.384132    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Searching for e:1c:a3:1b:1b:53 in /var/db/dhcpd_leases ...
	I0731 10:56:27.384152    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | Found 24 entries in /var/db/dhcpd_leases!
	I0731 10:56:27.384168    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.25 HWAddress:da:4c:82:ac:eb:ca ID:1,da:4c:82:ac:eb:ca Lease:0x66abcb82}
	I0731 10:56:27.384180    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.24 HWAddress:a:e1:ca:1e:61:19 ID:1,a:e1:ca:1e:61:19 Lease:0x66aa7972}
	I0731 10:56:27.384193    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.23 HWAddress:86:5b:eb:af:29:e ID:1,86:5b:eb:af:29:e Lease:0x66abca25}
	I0731 10:56:27.384210    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.22 HWAddress:5e:52:7c:ac:c9:f0 ID:1,5e:52:7c:ac:c9:f0 Lease:0x66abca32}
	I0731 10:56:27.384219    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.21 HWAddress:fa:d5:16:97:6c:fb ID:1,fa:d5:16:97:6c:fb Lease:0x66abc9e1}
	I0731 10:56:27.384226    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.20 HWAddress:fe:c8:d1:ca:3b:79 ID:1,fe:c8:d1:ca:3b:79 Lease:0x66abc8e4}
	I0731 10:56:27.384234    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.19 HWAddress:de:49:7e:5:fd:fc ID:1,de:49:7e:5:fd:fc Lease:0x66abc807}
	I0731 10:56:27.384241    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.18 HWAddress:12:af:56:b3:66:56 ID:1,12:af:56:b3:66:56 Lease:0x66abc708}
	I0731 10:56:27.384249    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.17 HWAddress:56:42:c1:6:46:4b ID:1,56:42:c1:6:46:4b Lease:0x66aa74ff}
	I0731 10:56:27.384255    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.16 HWAddress:16:4a:33:14:b6:82 ID:1,16:4a:33:14:b6:82 Lease:0x66abc6de}
	I0731 10:56:27.384264    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.15 HWAddress:2e:dd:d2:a4:c2:89 ID:1,2e:dd:d2:a4:c2:89 Lease:0x66abc69b}
	I0731 10:56:27.384271    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.14 HWAddress:32:5a:49:c6:7e:12 ID:1,32:5a:49:c6:7e:12 Lease:0x66abc463}
	I0731 10:56:27.384278    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.13 HWAddress:9e:14:70:a:cd:e4 ID:1,9e:14:70:a:cd:e4 Lease:0x66abc43a}
	I0731 10:56:27.384286    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.12 HWAddress:32:3e:7e:4b:10:d0 ID:1,32:3e:7e:4b:10:d0 Lease:0x66abc3d5}
	I0731 10:56:27.384293    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.11 HWAddress:82:f8:e9:d8:ea:5b ID:1,82:f8:e9:d8:ea:5b Lease:0x66aa7241}
	I0731 10:56:27.384302    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:3e:83:1f:81:9a:13 ID:1,3e:83:1f:81:9a:13 Lease:0x66aa70d8}
	I0731 10:56:27.384310    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:36:7:7a:57:98:ed ID:1,36:7:7a:57:98:ed Lease:0x66abc1c5}
	I0731 10:56:27.384318    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa70a6}
	I0731 10:56:27.384325    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:56:27.384334    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:56:27.384341    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:56:27.384349    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 10:56:27.384363    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 10:56:27.384375    6225 main.go:141] libmachine: (force-systemd-env-141000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 10:56:29.384578    6225 client.go:171] duration metric: took 1m1.03216738s to LocalClient.Create
	I0731 10:56:31.386473    6225 start.go:128] duration metric: took 1m3.065573856s to createHost
	I0731 10:56:31.386491    6225 start.go:83] releasing machines lock for "force-systemd-env-141000", held for 1m3.065666993s
	W0731 10:56:31.386595    6225 out.go:239] * Failed to start hyperkit VM. Running "minikube delete -p force-systemd-env-141000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for e:1c:a3:1b:1b:53
	* Failed to start hyperkit VM. Running "minikube delete -p force-systemd-env-141000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for e:1c:a3:1b:1b:53
	I0731 10:56:31.505029    6225 out.go:177] 
	W0731 10:56:31.528400    6225 out.go:239] X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for e:1c:a3:1b:1b:53
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for e:1c:a3:1b:1b:53
	W0731 10:56:31.528412    6225 out.go:239] * 
	* 
	W0731 10:56:31.529117    6225 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:56:31.614105    6225 out.go:177] 

                                                
                                                
** /stderr **
docker_test.go:157: failed to start minikube with args: "out/minikube-darwin-amd64 start -p force-systemd-env-141000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit " : exit status 80
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-env-141000 ssh "docker info --format {{.CgroupDriver}}"
docker_test.go:110: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p force-systemd-env-141000 ssh "docker info --format {{.CgroupDriver}}": exit status 50 (180.775572ms)

                                                
                                                
-- stdout --
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to DRV_CP_ENDPOINT: Unable to get control-plane node force-systemd-env-141000 endpoint: failed to lookup ip for ""
	* Suggestion: 
	
	    Recreate the cluster by running:
	    minikube delete <no value>
	    minikube start <no value>

                                                
                                                
** /stderr **
docker_test.go:112: failed to get docker cgroup driver. args "out/minikube-darwin-amd64 -p force-systemd-env-141000 ssh \"docker info --format {{.CgroupDriver}}\"": exit status 50
docker_test.go:166: *** TestForceSystemdEnv FAILED at 2024-07-31 10:56:31.948109 -0700 PDT m=+4637.045551348
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p force-systemd-env-141000 -n force-systemd-env-141000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p force-systemd-env-141000 -n force-systemd-env-141000: exit status 7 (77.732208ms)

                                                
                                                
-- stdout --
	Error

                                                
                                                
-- /stdout --
** stderr ** 
	E0731 10:56:32.023887    6296 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0731 10:56:32.023910    6296 status.go:249] status error: getting IP: IP address is not set

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "force-systemd-env-141000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:175: Cleaning up "force-systemd-env-141000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-env-141000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-env-141000: (5.259683031s)
--- FAIL: TestForceSystemdEnv (201.73s)

                                                
                                    
TestMultiControlPlane/serial/AddWorkerNode (79.37s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 node add -p ha-393000 -v=7 --alsologtostderr
E0731 09:56:54.503868    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:56:54.509731    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:56:54.520419    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:56:54.541747    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:56:54.582190    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:56:54.662634    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:56:54.822947    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:56:55.143872    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:56:55.785045    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:56:57.066175    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:56:59.627289    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:57:04.748598    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:57:14.989785    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 09:57:35.470590    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
ha_test.go:228: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p ha-393000 -v=7 --alsologtostderr: exit status 90 (1m16.385608455s)

                                                
                                                
-- stdout --
	* Adding node m04 to cluster ha-393000 as [worker]
	* Starting "ha-393000-m04" worker node in "ha-393000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 09:56:54.556126    3086 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:56:54.556406    3086 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:56:54.556417    3086 out.go:304] Setting ErrFile to fd 2...
	I0731 09:56:54.556424    3086 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:56:54.556618    3086 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:56:54.556987    3086 mustload.go:65] Loading cluster: ha-393000
	I0731 09:56:54.557280    3086 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:56:54.557677    3086 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:56:54.557710    3086 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:56:54.565987    3086 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51223
	I0731 09:56:54.566397    3086 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:56:54.566810    3086 main.go:141] libmachine: Using API Version  1
	I0731 09:56:54.566819    3086 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:56:54.567022    3086 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:56:54.567137    3086 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:56:54.567221    3086 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:56:54.567289    3086 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:56:54.568310    3086 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:56:54.568568    3086 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:56:54.568590    3086 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:56:54.576872    3086 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51225
	I0731 09:56:54.577231    3086 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:56:54.577549    3086 main.go:141] libmachine: Using API Version  1
	I0731 09:56:54.577562    3086 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:56:54.577774    3086 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:56:54.577895    3086 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:56:54.578244    3086 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:56:54.578269    3086 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:56:54.586617    3086 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51227
	I0731 09:56:54.586943    3086 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:56:54.587272    3086 main.go:141] libmachine: Using API Version  1
	I0731 09:56:54.587284    3086 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:56:54.587508    3086 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:56:54.587615    3086 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:56:54.587708    3086 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:56:54.587774    3086 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:56:54.588791    3086 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:56:54.589047    3086 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:56:54.589076    3086 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:56:54.597427    3086 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51229
	I0731 09:56:54.597758    3086 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:56:54.598075    3086 main.go:141] libmachine: Using API Version  1
	I0731 09:56:54.598086    3086 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:56:54.598310    3086 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:56:54.598422    3086 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:56:54.598775    3086 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:56:54.598800    3086 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:56:54.607234    3086 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51231
	I0731 09:56:54.607560    3086 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:56:54.607856    3086 main.go:141] libmachine: Using API Version  1
	I0731 09:56:54.607867    3086 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:56:54.608074    3086 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:56:54.608184    3086 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:56:54.608275    3086 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:56:54.608346    3086 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:56:54.609346    3086 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:56:54.609596    3086 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:56:54.609622    3086 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:56:54.617863    3086 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51233
	I0731 09:56:54.618170    3086 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:56:54.618560    3086 main.go:141] libmachine: Using API Version  1
	I0731 09:56:54.618576    3086 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:56:54.618795    3086 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:56:54.618936    3086 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:56:54.619046    3086 api_server.go:166] Checking apiserver status ...
	I0731 09:56:54.619105    3086 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:56:54.619124    3086 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:56:54.619241    3086 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:56:54.619339    3086 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:56:54.619429    3086 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:56:54.619573    3086 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:56:54.662577    3086 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:56:54.670065    3086 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:56:54.670129    3086 ssh_runner.go:195] Run: ls
	I0731 09:56:54.673567    3086 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 09:56:54.676890    3086 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 09:56:54.698693    3086 out.go:177] * Adding node m04 to cluster ha-393000 as [worker]
	I0731 09:56:54.719613    3086 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:56:54.719705    3086 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:56:54.741431    3086 out.go:177] * Starting "ha-393000-m04" worker node in "ha-393000" cluster
	I0731 09:56:54.762398    3086 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:56:54.762448    3086 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 09:56:54.762472    3086 cache.go:56] Caching tarball of preloaded images
	I0731 09:56:54.762714    3086 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:56:54.762734    3086 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:56:54.762864    3086 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:56:54.763632    3086 start.go:360] acquireMachinesLock for ha-393000-m04: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:56:54.763801    3086 start.go:364] duration metric: took 136.173µs to acquireMachinesLock for "ha-393000-m04"
	I0731 09:56:54.763835    3086 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP: Port:0 KubernetesVersion:v1.30.3 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m04 IP: Port:0 KubernetesVersion:v1.30.3 ContainerRuntime: ControlPlane:false Worker:true}
	I0731 09:56:54.764064    3086 start.go:125] createHost starting for "m04" (driver="hyperkit")
	I0731 09:56:54.785220    3086 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:56:54.785443    3086 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:56:54.785487    3086 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:56:54.795586    3086 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51237
	I0731 09:56:54.795934    3086 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:56:54.796242    3086 main.go:141] libmachine: Using API Version  1
	I0731 09:56:54.796251    3086 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:56:54.796446    3086 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:56:54.796559    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 09:56:54.796648    3086 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:56:54.796750    3086 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:56:54.796772    3086 client.go:168] LocalClient.Create starting
	I0731 09:56:54.796805    3086 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:56:54.796863    3086 main.go:141] libmachine: Decoding PEM data...
	I0731 09:56:54.796876    3086 main.go:141] libmachine: Parsing certificate...
	I0731 09:56:54.796937    3086 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:56:54.796979    3086 main.go:141] libmachine: Decoding PEM data...
	I0731 09:56:54.796990    3086 main.go:141] libmachine: Parsing certificate...
	I0731 09:56:54.797011    3086 main.go:141] libmachine: Running pre-create checks...
	I0731 09:56:54.797017    3086 main.go:141] libmachine: (ha-393000-m04) Calling .PreCreateCheck
	I0731 09:56:54.797091    3086 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:56:54.797123    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetConfigRaw
	I0731 09:56:54.797599    3086 main.go:141] libmachine: Creating machine...
	I0731 09:56:54.797611    3086 main.go:141] libmachine: (ha-393000-m04) Calling .Create
	I0731 09:56:54.797693    3086 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:56:54.797805    3086 main.go:141] libmachine: (ha-393000-m04) DBG | I0731 09:56:54.797687    3094 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:56:54.797867    3086 main.go:141] libmachine: (ha-393000-m04) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:56:55.163661    3086 main.go:141] libmachine: (ha-393000-m04) DBG | I0731 09:56:55.163603    3094 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa...
	I0731 09:56:55.249495    3086 main.go:141] libmachine: (ha-393000-m04) DBG | I0731 09:56:55.249437    3094 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk...
	I0731 09:56:55.249517    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Writing magic tar header
	I0731 09:56:55.249530    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Writing SSH key tar header
	I0731 09:56:55.250149    3086 main.go:141] libmachine: (ha-393000-m04) DBG | I0731 09:56:55.250121    3094 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04 ...
	I0731 09:56:55.797391    3086 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:56:55.797406    3086 main.go:141] libmachine: (ha-393000-m04) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid
	I0731 09:56:55.797438    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Using UUID 8a49f5e0-ba79-41ac-9a76-c032dc065628
	I0731 09:56:55.826312    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Generated MAC d2:d8:fb:1d:1:ee
	I0731 09:56:55.826333    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:56:55.826368    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:56:55.826390    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:56:55.826436    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "8a49f5e0-ba79-41ac-9a76-c032dc065628", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:56:55.826467    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 8a49f5e0-ba79-41ac-9a76-c032dc065628 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:56:55.826504    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:56:55.829506    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 DEBUG: hyperkit: Pid is 3095
	I0731 09:56:55.830427    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 0
	I0731 09:56:55.830449    3086 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:56:55.830501    3086 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:56:55.831550    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 09:56:55.831636    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0731 09:56:55.831658    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 09:56:55.831700    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:56:55.831722    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:56:55.831734    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:56:55.831742    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:56:55.831751    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:56:55.837943    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:56:55.847737    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:56:55.848596    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:56:55.848645    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:56:55.848667    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:56:55.848701    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:55 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:56:56.237167    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:56 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:56:56.237182    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:56 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:56:56.352008    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:56:56.352025    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:56:56.352034    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:56:56.352040    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:56 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:56:56.352794    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:56 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:56:56.352804    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:56:56 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:56:57.833264    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 1
	I0731 09:56:57.833299    3086 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:56:57.833390    3086 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:56:57.834194    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 09:56:57.834251    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0731 09:56:57.834269    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 09:56:57.834296    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:56:57.834308    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:56:57.834323    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:56:57.834336    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:56:57.834346    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:56:59.835162    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 2
	I0731 09:56:59.835192    3086 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:56:59.835268    3086 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:56:59.836069    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 09:56:59.836120    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0731 09:56:59.836127    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 09:56:59.836135    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:56:59.836150    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:56:59.836158    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:56:59.836164    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:56:59.836171    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:57:01.837699    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 3
	I0731 09:57:01.837716    3086 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:57:01.837847    3086 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:57:01.838688    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 09:57:01.838738    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0731 09:57:01.838755    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 09:57:01.838764    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:57:01.838776    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:57:01.838786    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:57:01.838792    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:57:01.838801    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:57:01.993504    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:57:01 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 09:57:01.993551    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:57:01 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 09:57:01.993561    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:57:01 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 09:57:02.017925    3086 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 09:57:02 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 09:57:03.839252    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 4
	I0731 09:57:03.839274    3086 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:57:03.839374    3086 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:57:03.840290    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 09:57:03.840368    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0731 09:57:03.840384    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 09:57:03.840395    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:57:03.840401    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:57:03.840421    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:57:03.840433    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:57:03.840451    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:57:05.842111    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 5
	I0731 09:57:05.842141    3086 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:57:05.842239    3086 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:57:05.843036    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 09:57:05.843092    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 09:57:05.843105    3086 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66abbe60}
	I0731 09:57:05.843114    3086 main.go:141] libmachine: (ha-393000-m04) DBG | Found match: d2:d8:fb:1d:1:ee
	I0731 09:57:05.843121    3086 main.go:141] libmachine: (ha-393000-m04) DBG | IP: 192.169.0.8
	I0731 09:57:05.843169    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetConfigRaw
	I0731 09:57:05.843756    3086 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:57:05.843837    3086 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:57:05.843920    3086 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:57:05.843929    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:57:05.844012    3086 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:57:05.844070    3086 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:57:05.844849    3086 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:57:05.844860    3086 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:57:05.844866    3086 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:57:05.844872    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:05.844947    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:05.845061    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:05.845140    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:05.845224    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:05.845355    3086 main.go:141] libmachine: Using SSH client type: native
	I0731 09:57:05.845577    3086 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xf6010c0] 0xf603e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 09:57:05.845585    3086 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:57:06.900311    3086 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:57:06.900326    3086 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:57:06.900333    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:06.900473    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:06.900555    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:06.900638    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:06.900724    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:06.900853    3086 main.go:141] libmachine: Using SSH client type: native
	I0731 09:57:06.901019    3086 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xf6010c0] 0xf603e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 09:57:06.901028    3086 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:57:06.956106    3086 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:57:06.956177    3086 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:57:06.956185    3086 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:57:06.956190    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 09:57:06.956327    3086 buildroot.go:166] provisioning hostname "ha-393000-m04"
	I0731 09:57:06.956337    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 09:57:06.956417    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:06.956512    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:06.956596    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:06.956675    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:06.956751    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:06.956870    3086 main.go:141] libmachine: Using SSH client type: native
	I0731 09:57:06.957013    3086 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xf6010c0] 0xf603e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 09:57:06.957021    3086 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m04 && echo "ha-393000-m04" | sudo tee /etc/hostname
	I0731 09:57:07.028543    3086 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m04
	
	I0731 09:57:07.028579    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:07.028721    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:07.028835    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:07.028930    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:07.029026    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:07.029171    3086 main.go:141] libmachine: Using SSH client type: native
	I0731 09:57:07.029328    3086 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xf6010c0] 0xf603e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 09:57:07.029341    3086 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:57:07.092146    3086 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:57:07.092169    3086 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:57:07.092191    3086 buildroot.go:174] setting up certificates
	I0731 09:57:07.092207    3086 provision.go:84] configureAuth start
	I0731 09:57:07.092214    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 09:57:07.092369    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:57:07.092462    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:07.092551    3086 provision.go:143] copyHostCerts
	I0731 09:57:07.092585    3086 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:57:07.092641    3086 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:57:07.092649    3086 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:57:07.092788    3086 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:57:07.093013    3086 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:57:07.093053    3086 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:57:07.093058    3086 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:57:07.093162    3086 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:57:07.093301    3086 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:57:07.093342    3086 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:57:07.093347    3086 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:57:07.093423    3086 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:57:07.093564    3086 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m04 san=[127.0.0.1 192.169.0.8 ha-393000-m04 localhost minikube]
	I0731 09:57:07.366208    3086 provision.go:177] copyRemoteCerts
	I0731 09:57:07.366259    3086 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:57:07.366285    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:07.366434    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:07.366527    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:07.366641    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:07.366729    3086 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:57:07.401469    3086 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:57:07.401549    3086 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:57:07.421435    3086 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:57:07.421508    3086 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 09:57:07.441382    3086 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:57:07.441452    3086 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 09:57:07.461492    3086 provision.go:87] duration metric: took 369.272339ms to configureAuth
	I0731 09:57:07.461507    3086 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:57:07.461690    3086 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:57:07.461703    3086 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:57:07.461834    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:07.461932    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:07.462011    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:07.462092    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:07.462185    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:07.462299    3086 main.go:141] libmachine: Using SSH client type: native
	I0731 09:57:07.462420    3086 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xf6010c0] 0xf603e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 09:57:07.462428    3086 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:57:07.519225    3086 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:57:07.519237    3086 buildroot.go:70] root file system type: tmpfs
	I0731 09:57:07.519311    3086 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:57:07.519328    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:07.519509    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:07.519650    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:07.519777    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:07.519871    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:07.520016    3086 main.go:141] libmachine: Using SSH client type: native
	I0731 09:57:07.520160    3086 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xf6010c0] 0xf603e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 09:57:07.520207    3086 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:57:07.587309    3086 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:57:07.587332    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:07.587469    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:07.587570    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:07.587656    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:07.587746    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:07.587855    3086 main.go:141] libmachine: Using SSH client type: native
	I0731 09:57:07.588005    3086 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xf6010c0] 0xf603e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 09:57:07.588016    3086 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:57:09.116203    3086 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 09:57:09.116219    3086 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:57:09.116225    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetURL
	I0731 09:57:09.116384    3086 main.go:141] libmachine: Docker is up and running!
	I0731 09:57:09.116393    3086 main.go:141] libmachine: Reticulating splines...
	I0731 09:57:09.116398    3086 client.go:171] duration metric: took 14.319623079s to LocalClient.Create
	I0731 09:57:09.116410    3086 start.go:167] duration metric: took 14.319664103s to libmachine.API.Create "ha-393000"
	I0731 09:57:09.116419    3086 start.go:293] postStartSetup for "ha-393000-m04" (driver="hyperkit")
	I0731 09:57:09.116439    3086 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:57:09.116450    3086 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:57:09.116596    3086 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:57:09.116608    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:09.116685    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:09.116780    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:09.116869    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:09.116948    3086 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:57:09.158477    3086 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:57:09.162891    3086 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:57:09.162909    3086 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:57:09.163024    3086 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:57:09.163213    3086 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:57:09.163222    3086 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:57:09.163438    3086 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:57:09.177539    3086 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:57:09.206074    3086 start.go:296] duration metric: took 89.644638ms for postStartSetup
	I0731 09:57:09.206112    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetConfigRaw
	I0731 09:57:09.206712    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:57:09.206890    3086 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:57:09.207252    3086 start.go:128] duration metric: took 14.443172777s to createHost
	I0731 09:57:09.207272    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:09.207376    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:09.207473    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:09.207559    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:09.207645    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:09.207758    3086 main.go:141] libmachine: Using SSH client type: native
	I0731 09:57:09.207880    3086 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xf6010c0] 0xf603e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 09:57:09.207888    3086 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 09:57:09.263810    3086 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445028.999903210
	
	I0731 09:57:09.263823    3086 fix.go:216] guest clock: 1722445028.999903210
	I0731 09:57:09.263828    3086 fix.go:229] Guest: 2024-07-31 09:57:08.99990321 -0700 PDT Remote: 2024-07-31 09:57:09.20726 -0700 PDT m=+14.687261773 (delta=-207.35679ms)
	I0731 09:57:09.263853    3086 fix.go:200] guest clock delta is within tolerance: -207.35679ms
	I0731 09:57:09.263857    3086 start.go:83] releasing machines lock for "ha-393000-m04", held for 14.50004854s
	I0731 09:57:09.263878    3086 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:57:09.264029    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:57:09.264165    3086 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:57:09.264571    3086 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:57:09.264708    3086 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:57:09.264824    3086 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:57:09.264907    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:09.265010    3086 ssh_runner.go:195] Run: systemctl --version
	I0731 09:57:09.265028    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:57:09.265063    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:09.265204    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:57:09.265224    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:09.265338    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:57:09.265359    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:09.265540    3086 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:57:09.265559    3086 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:57:09.265711    3086 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:57:09.299562    3086 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 09:57:09.342104    3086 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:57:09.342178    3086 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:57:09.355944    3086 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:57:09.355960    3086 start.go:495] detecting cgroup driver to use...
	I0731 09:57:09.356062    3086 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:57:09.371440    3086 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:57:09.380534    3086 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:57:09.389565    3086 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:57:09.389618    3086 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:57:09.398888    3086 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:57:09.408034    3086 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:57:09.417175    3086 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:57:09.426346    3086 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:57:09.435516    3086 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:57:09.443978    3086 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:57:09.453018    3086 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:57:09.462111    3086 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:57:09.470219    3086 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:57:09.478374    3086 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:57:09.573504    3086 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:57:09.593433    3086 start.go:495] detecting cgroup driver to use...
	I0731 09:57:09.593511    3086 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:57:09.611616    3086 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:57:09.626222    3086 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:57:09.648328    3086 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:57:09.659707    3086 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:57:09.670705    3086 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:57:09.689406    3086 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:57:09.699922    3086 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:57:09.715837    3086 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:57:09.718987    3086 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:57:09.726543    3086 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:57:09.740576    3086 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:57:09.841011    3086 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:57:09.942212    3086 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:57:09.942291    3086 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:57:09.958029    3086 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:57:10.066774    3086 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:58:10.737469    3086 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1m0.670685753s)
	I0731 09:58:10.737530    3086 ssh_runner.go:195] Run: sudo journalctl --no-pager -u docker
	I0731 09:58:10.771925    3086 out.go:177] 
	W0731 09:58:10.808499    3086 out.go:239] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Jul 31 16:57:07 ha-393000-m04 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 16:57:07 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:07.614609547Z" level=info msg="Starting up"
	Jul 31 16:57:07 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:07.615069210Z" level=info msg="containerd not running, starting managed containerd"
	Jul 31 16:57:07 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:07.615686671Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=521
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.634118675Z" level=info msg="starting containerd" revision=8fc6bcff51318944179630522a095cc9dbf9f353 version=v1.7.20
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.648813739Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.648905647Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.648987668Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649023543Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649115573Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649153677Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649310821Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649351351Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649386967Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649417098Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649504924Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649741589Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651321793Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651374942Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651514496Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651561852Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651656435Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651726261Z" level=info msg="metadata content store policy set" policy=shared
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654123262Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654204695Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654251251Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654350067Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654393409Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654487591Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654734962Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654883200Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654935119Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654995034Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655049761Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655110876Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655152322Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655191037Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655226343Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655259366Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655291253Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655328129Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655366982Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655401989Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655433366Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655467277Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655499862Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655538340Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655582009Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655617049Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655648067Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655715866Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655752410Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655782374Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655814843Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655847271Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655882583Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655915529Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655955383Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656029336Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656073199Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656164066Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656205255Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656236085Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656267066Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656299327Z" level=info msg="NRI interface is disabled by configuration."
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656507264Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656594759Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656655016Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656736183Z" level=info msg="containerd successfully booted in 0.023392s"
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.637844374Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.647290585Z" level=info msg="Loading containers: start."
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.730204042Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.814059447Z" level=info msg="Loading containers: done."
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.822932218Z" level=info msg="Docker daemon" commit=cc13f95 containerd-snapshotter=false storage-driver=overlay2 version=27.1.1
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.823023957Z" level=info msg="Daemon has completed initialization"
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.850070242Z" level=info msg="API listen on /var/run/docker.sock"
	Jul 31 16:57:08 ha-393000-m04 systemd[1]: Started Docker Application Container Engine.
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.850121607Z" level=info msg="API listen on [::]:2376"
	Jul 31 16:57:09 ha-393000-m04 systemd[1]: Stopping Docker Application Container Engine...
	Jul 31 16:57:09 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:09.815841889Z" level=info msg="Processing signal 'terminated'"
	Jul 31 16:57:09 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:09.817678531Z" level=info msg="Daemon shutdown complete"
	Jul 31 16:57:09 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:09.817761622Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=moby
	Jul 31 16:57:09 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:09.817780933Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Jul 31 16:57:09 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:09.817930316Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Jul 31 16:57:10 ha-393000-m04 systemd[1]: docker.service: Deactivated successfully.
	Jul 31 16:57:10 ha-393000-m04 systemd[1]: Stopped Docker Application Container Engine.
	Jul 31 16:57:10 ha-393000-m04 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 16:57:10 ha-393000-m04 dockerd[908]: time="2024-07-31T16:57:10.850183994Z" level=info msg="Starting up"
	Jul 31 16:58:10 ha-393000-m04 dockerd[908]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Jul 31 16:58:10 ha-393000-m04 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Jul 31 16:58:10 ha-393000-m04 systemd[1]: docker.service: Failed with result 'exit-code'.
	Jul 31 16:58:10 ha-393000-m04 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Jul 31 16:57:07 ha-393000-m04 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 16:57:07 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:07.614609547Z" level=info msg="Starting up"
	Jul 31 16:57:07 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:07.615069210Z" level=info msg="containerd not running, starting managed containerd"
	Jul 31 16:57:07 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:07.615686671Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=521
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.634118675Z" level=info msg="starting containerd" revision=8fc6bcff51318944179630522a095cc9dbf9f353 version=v1.7.20
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.648813739Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.648905647Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.648987668Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649023543Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649115573Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649153677Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649310821Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649351351Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649386967Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649417098Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649504924Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.649741589Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651321793Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651374942Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651514496Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651561852Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651656435Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.651726261Z" level=info msg="metadata content store policy set" policy=shared
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654123262Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654204695Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654251251Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654350067Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654393409Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654487591Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654734962Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654883200Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654935119Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.654995034Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655049761Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655110876Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655152322Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655191037Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655226343Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655259366Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655291253Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655328129Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655366982Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655401989Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655433366Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655467277Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655499862Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655538340Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655582009Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655617049Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655648067Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655715866Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655752410Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655782374Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655814843Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655847271Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655882583Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655915529Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.655955383Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656029336Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656073199Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656164066Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656205255Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656236085Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656267066Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656299327Z" level=info msg="NRI interface is disabled by configuration."
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656507264Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656594759Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656655016Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Jul 31 16:57:07 ha-393000-m04 dockerd[521]: time="2024-07-31T16:57:07.656736183Z" level=info msg="containerd successfully booted in 0.023392s"
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.637844374Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.647290585Z" level=info msg="Loading containers: start."
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.730204042Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.814059447Z" level=info msg="Loading containers: done."
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.822932218Z" level=info msg="Docker daemon" commit=cc13f95 containerd-snapshotter=false storage-driver=overlay2 version=27.1.1
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.823023957Z" level=info msg="Daemon has completed initialization"
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.850070242Z" level=info msg="API listen on /var/run/docker.sock"
	Jul 31 16:57:08 ha-393000-m04 systemd[1]: Started Docker Application Container Engine.
	Jul 31 16:57:08 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:08.850121607Z" level=info msg="API listen on [::]:2376"
	Jul 31 16:57:09 ha-393000-m04 systemd[1]: Stopping Docker Application Container Engine...
	Jul 31 16:57:09 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:09.815841889Z" level=info msg="Processing signal 'terminated'"
	Jul 31 16:57:09 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:09.817678531Z" level=info msg="Daemon shutdown complete"
	Jul 31 16:57:09 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:09.817761622Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=moby
	Jul 31 16:57:09 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:09.817780933Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Jul 31 16:57:09 ha-393000-m04 dockerd[515]: time="2024-07-31T16:57:09.817930316Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Jul 31 16:57:10 ha-393000-m04 systemd[1]: docker.service: Deactivated successfully.
	Jul 31 16:57:10 ha-393000-m04 systemd[1]: Stopped Docker Application Container Engine.
	Jul 31 16:57:10 ha-393000-m04 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 16:57:10 ha-393000-m04 dockerd[908]: time="2024-07-31T16:57:10.850183994Z" level=info msg="Starting up"
	Jul 31 16:58:10 ha-393000-m04 dockerd[908]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Jul 31 16:58:10 ha-393000-m04 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Jul 31 16:58:10 ha-393000-m04 systemd[1]: docker.service: Failed with result 'exit-code'.
	Jul 31 16:58:10 ha-393000-m04 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	W0731 09:58:10.808558    3086 out.go:239] * 
	W0731 09:58:10.810876    3086 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 09:58:10.863628    3086 out.go:177] 

                                                
                                                
** /stderr **
ha_test.go:230: failed to add worker node to current ha (multi-control plane) cluster. args "out/minikube-darwin-amd64 node add -p ha-393000 -v=7 --alsologtostderr" : exit status 90
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:244: <<< TestMultiControlPlane/serial/AddWorkerNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/AddWorkerNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (2.339656014s)
helpers_test.go:252: TestMultiControlPlane/serial/AddWorkerNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |      Profile      |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| image   | functional-680000 image build -t     | functional-680000 | jenkins | v1.33.1 | 31 Jul 24 09:52 PDT | 31 Jul 24 09:52 PDT |
	|         | localhost/my-image:functional-680000 |                   |         |         |                     |                     |
	|         | testdata/build --alsologtostderr     |                   |         |         |                     |                     |
	| image   | functional-680000 image ls           | functional-680000 | jenkins | v1.33.1 | 31 Jul 24 09:52 PDT | 31 Jul 24 09:52 PDT |
	| delete  | -p functional-680000                 | functional-680000 | jenkins | v1.33.1 | 31 Jul 24 09:53 PDT | 31 Jul 24 09:53 PDT |
	| start   | -p ha-393000 --wait=true             | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:53 PDT | 31 Jul 24 09:56 PDT |
	|         | --memory=2200 --ha                   |                   |         |         |                     |                     |
	|         | -v=7 --alsologtostderr               |                   |         |         |                     |                     |
	|         | --driver=hyperkit                    |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- apply -f             | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | ./testdata/ha/ha-pod-dns-test.yaml   |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- rollout status       | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | deployment/busybox                   |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |                   |         |         |                     |                     |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 09:53:16
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 09:53:16.140722    2954 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:53:16.140891    2954 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:53:16.140897    2954 out.go:304] Setting ErrFile to fd 2...
	I0731 09:53:16.140901    2954 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:53:16.141085    2954 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:53:16.142669    2954 out.go:298] Setting JSON to false
	I0731 09:53:16.166361    2954 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1366,"bootTime":1722443430,"procs":467,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 09:53:16.166460    2954 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 09:53:16.192371    2954 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 09:53:16.233499    2954 notify.go:220] Checking for updates...
	I0731 09:53:16.263444    2954 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 09:53:16.328756    2954 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:53:16.398694    2954 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 09:53:16.420465    2954 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 09:53:16.443406    2954 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:53:16.464565    2954 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 09:53:16.486871    2954 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 09:53:16.517461    2954 out.go:177] * Using the hyperkit driver based on user configuration
	I0731 09:53:16.559490    2954 start.go:297] selected driver: hyperkit
	I0731 09:53:16.559519    2954 start.go:901] validating driver "hyperkit" against <nil>
	I0731 09:53:16.559538    2954 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 09:53:16.563960    2954 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:53:16.564071    2954 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 09:53:16.572413    2954 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 09:53:16.576399    2954 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:53:16.576420    2954 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 09:53:16.576454    2954 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 09:53:16.576646    2954 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:53:16.576708    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:16.576719    2954 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0731 09:53:16.576725    2954 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0731 09:53:16.576791    2954 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:53:16.576877    2954 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:53:16.619419    2954 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 09:53:16.640390    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:53:16.640480    2954 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 09:53:16.640509    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:53:16.640712    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:53:16.640731    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:53:16.641227    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:53:16.641275    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json: {Name:mka52f595799559e261228b691f11b60413ee780 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:16.641876    2954 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:53:16.641986    2954 start.go:364] duration metric: took 90.888µs to acquireMachinesLock for "ha-393000"
	I0731 09:53:16.642025    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:53:16.642108    2954 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 09:53:16.663233    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:53:16.663389    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:53:16.663426    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:53:16.672199    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51037
	I0731 09:53:16.672559    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:53:16.672976    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:53:16.672987    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:53:16.673241    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:53:16.673369    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:16.673473    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:16.673584    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:53:16.673605    2954 client.go:168] LocalClient.Create starting
	I0731 09:53:16.673642    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:53:16.673693    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:53:16.673710    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:53:16.673763    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:53:16.673801    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:53:16.673815    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:53:16.673840    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:53:16.673850    2954 main.go:141] libmachine: (ha-393000) Calling .PreCreateCheck
	I0731 09:53:16.673929    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:16.674073    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:16.684622    2954 main.go:141] libmachine: Creating machine...
	I0731 09:53:16.684647    2954 main.go:141] libmachine: (ha-393000) Calling .Create
	I0731 09:53:16.684806    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:16.685170    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.684943    2962 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:53:16.685305    2954 main.go:141] libmachine: (ha-393000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:53:16.866642    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.866533    2962 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa...
	I0731 09:53:16.907777    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.907707    2962 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk...
	I0731 09:53:16.907795    2954 main.go:141] libmachine: (ha-393000) DBG | Writing magic tar header
	I0731 09:53:16.907815    2954 main.go:141] libmachine: (ha-393000) DBG | Writing SSH key tar header
	I0731 09:53:16.908296    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.908249    2962 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000 ...
	I0731 09:53:17.278530    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:17.278549    2954 main.go:141] libmachine: (ha-393000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 09:53:17.278657    2954 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 09:53:17.388690    2954 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 09:53:17.388709    2954 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:53:17.388758    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:53:17.388793    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:53:17.388830    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:53:17.388871    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:53:17.388884    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:53:17.391787    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Pid is 2965
	I0731 09:53:17.392177    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 09:53:17.392188    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:17.392264    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:17.393257    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:17.393317    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:17.393342    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:17.393359    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:17.393369    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:17.399449    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:53:17.451566    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:53:17.452146    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:53:17.452168    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:53:17.452176    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:53:17.452184    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:53:17.832667    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:53:17.832680    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:53:17.947165    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:53:17.947181    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:53:17.947203    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:53:17.947214    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:53:17.948083    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:53:17.948094    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:53:19.393474    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 1
	I0731 09:53:19.393491    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:19.393544    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:19.394408    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:19.394431    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:19.394439    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:19.394449    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:19.394461    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:21.396273    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 2
	I0731 09:53:21.396290    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:21.396404    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:21.397210    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:21.397262    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:21.397275    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:21.397283    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:21.397292    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:23.397619    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 3
	I0731 09:53:23.397635    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:23.397733    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:23.398576    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:23.398585    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:23.398595    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:23.398604    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:23.398623    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:23.511265    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 09:53:23.511317    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 09:53:23.511327    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 09:53:23.534471    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 09:53:25.399722    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 4
	I0731 09:53:25.399735    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:25.399799    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:25.400596    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:25.400655    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:25.400665    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:25.400672    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:25.400681    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:27.400848    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 5
	I0731 09:53:27.400872    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:27.400976    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:27.401778    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:27.401824    2954 main.go:141] libmachine: (ha-393000) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:53:27.401836    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:53:27.401845    2954 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 09:53:27.401856    2954 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 09:53:27.401921    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:27.402530    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:27.402623    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:27.402706    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:53:27.402714    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:53:27.402795    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:27.402846    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:27.403621    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:53:27.403635    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:53:27.403641    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:53:27.403647    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:27.403727    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:27.403804    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:27.403889    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:27.403968    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:27.404083    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:27.404258    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:27.404265    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:53:28.471124    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:53:28.471139    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:53:28.471151    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.471303    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.471413    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.471516    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.471604    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.471751    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.471894    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.471902    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:53:28.534700    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:53:28.534755    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:53:28.534761    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:53:28.534766    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.534914    2954 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 09:53:28.534924    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.535023    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.535122    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.535205    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.535305    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.535404    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.535525    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.535678    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.535686    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 09:53:28.612223    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 09:53:28.612243    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.612383    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.612495    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.612585    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.612692    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.612835    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.612989    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.613000    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:53:28.684692    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:53:28.684711    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:53:28.684731    2954 buildroot.go:174] setting up certificates
	I0731 09:53:28.684742    2954 provision.go:84] configureAuth start
	I0731 09:53:28.684753    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.684892    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:28.684986    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.685097    2954 provision.go:143] copyHostCerts
	I0731 09:53:28.685132    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:53:28.685202    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:53:28.685210    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:53:28.685348    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:53:28.685544    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:53:28.685575    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:53:28.685580    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:53:28.685671    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:53:28.685817    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:53:28.685858    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:53:28.685863    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:53:28.685947    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:53:28.686099    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 09:53:28.975770    2954 provision.go:177] copyRemoteCerts
	I0731 09:53:28.975860    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:53:28.975879    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.976044    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.976151    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.976253    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.976368    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:29.014295    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:53:29.014364    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0731 09:53:29.033836    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:53:29.033901    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 09:53:29.053674    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:53:29.053744    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:53:29.073245    2954 provision.go:87] duration metric: took 388.494938ms to configureAuth
	I0731 09:53:29.073258    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:53:29.073388    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:53:29.073402    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:29.073538    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.073618    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.073712    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.073794    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.073871    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.073977    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.074114    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.074121    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:53:29.138646    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:53:29.138660    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:53:29.138727    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:53:29.138739    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.138887    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.138979    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.139070    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.139173    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.139333    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.139499    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.139544    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:53:29.214149    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:53:29.214180    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.214320    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.214403    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.214495    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.214599    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.214718    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.214856    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.214868    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:53:30.823417    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 09:53:30.823433    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:53:30.823439    2954 main.go:141] libmachine: (ha-393000) Calling .GetURL
	I0731 09:53:30.823574    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:53:30.823582    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:53:30.823587    2954 client.go:171] duration metric: took 14.150104113s to LocalClient.Create
	I0731 09:53:30.823598    2954 start.go:167] duration metric: took 14.150148374s to libmachine.API.Create "ha-393000"
	I0731 09:53:30.823607    2954 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 09:53:30.823621    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:53:30.823633    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.823781    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:53:30.823793    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.823880    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.823974    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.824065    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.824160    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:30.868545    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:53:30.872572    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:53:30.872587    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:53:30.872696    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:53:30.872889    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:53:30.872896    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:53:30.873123    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:53:30.890087    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:53:30.911977    2954 start.go:296] duration metric: took 88.361428ms for postStartSetup
	I0731 09:53:30.912003    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:30.912600    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:30.912759    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:53:30.913103    2954 start.go:128] duration metric: took 14.271109881s to createHost
	I0731 09:53:30.913117    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.913201    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.913305    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.913399    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.913473    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.913588    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:30.913703    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:30.913711    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:53:30.978737    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444810.120322538
	
	I0731 09:53:30.978750    2954 fix.go:216] guest clock: 1722444810.120322538
	I0731 09:53:30.978755    2954 fix.go:229] Guest: 2024-07-31 09:53:30.120322538 -0700 PDT Remote: 2024-07-31 09:53:30.913111 -0700 PDT m=+14.813015151 (delta=-792.788462ms)
	I0731 09:53:30.978778    2954 fix.go:200] guest clock delta is within tolerance: -792.788462ms
	I0731 09:53:30.978783    2954 start.go:83] releasing machines lock for "ha-393000", held for 14.336915594s
	I0731 09:53:30.978805    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.978937    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:30.979046    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979390    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979496    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979591    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:53:30.979625    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.979645    2954 ssh_runner.go:195] Run: cat /version.json
	I0731 09:53:30.979655    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.979750    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.979786    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.979846    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.979902    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.979927    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.979985    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.980003    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:30.980063    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:31.061693    2954 ssh_runner.go:195] Run: systemctl --version
	I0731 09:53:31.066472    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 09:53:31.070647    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:53:31.070687    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:53:31.084420    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:53:31.084432    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:53:31.084539    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:53:31.099368    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:53:31.108753    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:53:31.117896    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:53:31.117944    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:53:31.126974    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:53:31.135823    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:53:31.144673    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:53:31.153676    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:53:31.162890    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:53:31.171995    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:53:31.181357    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:53:31.190300    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:53:31.198317    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:53:31.206286    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:31.306658    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:53:31.325552    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:53:31.325643    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:53:31.346571    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:53:31.359753    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:53:31.393299    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:53:31.404448    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:53:31.414860    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:53:31.437636    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:53:31.448198    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:53:31.464071    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:53:31.467113    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:53:31.474646    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:53:31.488912    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:53:31.589512    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:53:31.693775    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:53:31.693845    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:53:31.709549    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:31.811094    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:53:34.149023    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.337932224s)
	I0731 09:53:34.149088    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:53:34.161198    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:53:34.175766    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:53:34.187797    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:53:34.283151    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:53:34.377189    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:34.469067    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:53:34.482248    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:53:34.492385    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:34.587912    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:53:34.647834    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:53:34.647904    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:53:34.652204    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:53:34.652250    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:53:34.655108    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:53:34.680326    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:53:34.680403    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:53:34.699387    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:53:34.764313    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:53:34.764369    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:34.764763    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:53:34.769523    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:53:34.780319    2954 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 Moun
tType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 09:53:34.780379    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:53:34.780438    2954 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 09:53:34.792271    2954 docker.go:685] Got preloaded images: 
	I0731 09:53:34.792283    2954 docker.go:691] registry.k8s.io/kube-apiserver:v1.30.3 wasn't preloaded
	I0731 09:53:34.792332    2954 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0731 09:53:34.800298    2954 ssh_runner.go:195] Run: which lz4
	I0731 09:53:34.803039    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0731 09:53:34.803157    2954 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0731 09:53:34.806121    2954 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0731 09:53:34.806135    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (359612007 bytes)
	I0731 09:53:35.858525    2954 docker.go:649] duration metric: took 1.055419334s to copy over tarball
	I0731 09:53:35.858591    2954 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0731 09:53:38.196952    2954 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.338365795s)
	I0731 09:53:38.196967    2954 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0731 09:53:38.223533    2954 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0731 09:53:38.232307    2954 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2630 bytes)
	I0731 09:53:38.245888    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:38.355987    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:53:40.705059    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.349073816s)
	I0731 09:53:40.705149    2954 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 09:53:40.718481    2954 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0731 09:53:40.718506    2954 cache_images.go:84] Images are preloaded, skipping loading
	I0731 09:53:40.718529    2954 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 09:53:40.718621    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:53:40.718689    2954 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 09:53:40.756905    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:40.756918    2954 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0731 09:53:40.756931    2954 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 09:53:40.756946    2954 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 09:53:40.757028    2954 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 09:53:40.757045    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:53:40.757094    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:53:40.770142    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:53:40.770212    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:53:40.770264    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:53:40.778467    2954 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 09:53:40.778510    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 09:53:40.786404    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 09:53:40.799629    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:53:40.814270    2954 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 09:53:40.827819    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1446 bytes)
	I0731 09:53:40.841352    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:53:40.844280    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:53:40.854288    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:40.961875    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:53:40.976988    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 09:53:40.977000    2954 certs.go:194] generating shared ca certs ...
	I0731 09:53:40.977011    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:40.977205    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:53:40.977278    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:53:40.977287    2954 certs.go:256] generating profile certs ...
	I0731 09:53:40.977331    2954 certs.go:363] generating signed profile cert for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:53:40.977344    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt with IP's: []
	I0731 09:53:41.064733    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt ...
	I0731 09:53:41.064749    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt: {Name:mk11f8b5ec16b878c9f692ccaff9a489ecc76fb2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.065074    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key ...
	I0731 09:53:41.065082    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key: {Name:mk18e6554cf3c807804faf77a7a9620e92860212 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.065322    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9
	I0731 09:53:41.065337    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.254]
	I0731 09:53:41.267360    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 ...
	I0731 09:53:41.267375    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9: {Name:mk9c13a9d071c94395118e1f00f992954683ef5b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.267745    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9 ...
	I0731 09:53:41.267755    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9: {Name:mk49f9f4ab2c1350a3cdb49ded7d6cffd5f069e5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.267965    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:53:41.268145    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:53:41.268307    2954 certs.go:363] generating signed profile cert for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:53:41.268320    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt with IP's: []
	I0731 09:53:41.352486    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt ...
	I0731 09:53:41.352499    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt: {Name:mk6759a3c690d7a9e990f65c338d22538c5b127a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.352775    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key ...
	I0731 09:53:41.352788    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key: {Name:mk4f661b46725a943b9862deb5f02f250855a1b5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.352992    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:53:41.353021    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:53:41.353040    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:53:41.353059    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:53:41.353078    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:53:41.353096    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:53:41.353115    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:53:41.353132    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:53:41.353229    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:53:41.353280    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:53:41.353289    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:53:41.353319    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:53:41.353348    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:53:41.353377    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:53:41.353444    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:53:41.353475    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.353494    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.353511    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.353950    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:53:41.373611    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:53:41.392573    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:53:41.412520    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:53:41.433349    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0731 09:53:41.452365    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0731 09:53:41.472032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:53:41.491092    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:53:41.510282    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:53:41.529242    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:53:41.549127    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:53:41.568112    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 09:53:41.581548    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:53:41.585729    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:53:41.594979    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.598924    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.598977    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.603300    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:53:41.612561    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:53:41.621665    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.624970    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.625005    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.629117    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 09:53:41.638283    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:53:41.647422    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.650741    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.650776    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.654995    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:53:41.664976    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:53:41.668030    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:53:41.668072    2954 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:53:41.668156    2954 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 09:53:41.680752    2954 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 09:53:41.691788    2954 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0731 09:53:41.701427    2954 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0731 09:53:41.710462    2954 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0731 09:53:41.710473    2954 kubeadm.go:157] found existing configuration files:
	
	I0731 09:53:41.710522    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0731 09:53:41.718051    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0731 09:53:41.718109    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0731 09:53:41.726696    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0731 09:53:41.737698    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0731 09:53:41.737751    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0731 09:53:41.745907    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0731 09:53:41.753641    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0731 09:53:41.753680    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0731 09:53:41.761450    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0731 09:53:41.769156    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0731 09:53:41.769207    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0731 09:53:41.777068    2954 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0731 09:53:41.848511    2954 kubeadm.go:310] [init] Using Kubernetes version: v1.30.3
	I0731 09:53:41.848564    2954 kubeadm.go:310] [preflight] Running pre-flight checks
	I0731 09:53:41.937481    2954 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0731 09:53:41.937568    2954 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0731 09:53:41.937658    2954 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0731 09:53:42.093209    2954 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0731 09:53:42.137661    2954 out.go:204]   - Generating certificates and keys ...
	I0731 09:53:42.137715    2954 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0731 09:53:42.137758    2954 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0731 09:53:42.784132    2954 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0731 09:53:42.954915    2954 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0731 09:53:43.064099    2954 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0731 09:53:43.107145    2954 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0731 09:53:43.256550    2954 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0731 09:53:43.256643    2954 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-393000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0731 09:53:43.365808    2954 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0731 09:53:43.365910    2954 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-393000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0731 09:53:43.496987    2954 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0731 09:53:43.811530    2954 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0731 09:53:43.998883    2954 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0731 09:53:43.999156    2954 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0731 09:53:44.246352    2954 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0731 09:53:44.460463    2954 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0731 09:53:44.552236    2954 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0731 09:53:44.656335    2954 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0731 09:53:44.920852    2954 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0731 09:53:44.921188    2954 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0731 09:53:44.922677    2954 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0731 09:53:44.944393    2954 out.go:204]   - Booting up control plane ...
	I0731 09:53:44.944462    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0731 09:53:44.944530    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0731 09:53:44.944583    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0731 09:53:44.944663    2954 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0731 09:53:44.944728    2954 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0731 09:53:44.944759    2954 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0731 09:53:45.048317    2954 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0731 09:53:45.048393    2954 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0731 09:53:45.548165    2954 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 500.802272ms
	I0731 09:53:45.548224    2954 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0731 09:53:51.610602    2954 kubeadm.go:310] [api-check] The API server is healthy after 6.066816222s
	I0731 09:53:51.618854    2954 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0731 09:53:51.625868    2954 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0731 09:53:51.637830    2954 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0731 09:53:51.637998    2954 kubeadm.go:310] [mark-control-plane] Marking the node ha-393000 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0731 09:53:51.650953    2954 kubeadm.go:310] [bootstrap-token] Using token: wt4o9v.66pnb4w7anxpqs79
	I0731 09:53:51.687406    2954 out.go:204]   - Configuring RBAC rules ...
	I0731 09:53:51.687587    2954 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0731 09:53:51.690002    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0731 09:53:51.716618    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0731 09:53:51.718333    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0731 09:53:51.720211    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0731 09:53:51.722003    2954 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0731 09:53:52.016537    2954 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0731 09:53:52.431449    2954 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0731 09:53:53.015675    2954 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0731 09:53:53.016431    2954 kubeadm.go:310] 
	I0731 09:53:53.016524    2954 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0731 09:53:53.016539    2954 kubeadm.go:310] 
	I0731 09:53:53.016612    2954 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0731 09:53:53.016623    2954 kubeadm.go:310] 
	I0731 09:53:53.016649    2954 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0731 09:53:53.016721    2954 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0731 09:53:53.016763    2954 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0731 09:53:53.016773    2954 kubeadm.go:310] 
	I0731 09:53:53.016814    2954 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0731 09:53:53.016821    2954 kubeadm.go:310] 
	I0731 09:53:53.016868    2954 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0731 09:53:53.016891    2954 kubeadm.go:310] 
	I0731 09:53:53.016935    2954 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0731 09:53:53.017005    2954 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0731 09:53:53.017059    2954 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0731 09:53:53.017072    2954 kubeadm.go:310] 
	I0731 09:53:53.017139    2954 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0731 09:53:53.017203    2954 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0731 09:53:53.017207    2954 kubeadm.go:310] 
	I0731 09:53:53.017269    2954 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token wt4o9v.66pnb4w7anxpqs79 \
	I0731 09:53:53.017353    2954 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 \
	I0731 09:53:53.017373    2954 kubeadm.go:310] 	--control-plane 
	I0731 09:53:53.017381    2954 kubeadm.go:310] 
	I0731 09:53:53.017452    2954 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0731 09:53:53.017461    2954 kubeadm.go:310] 
	I0731 09:53:53.017528    2954 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token wt4o9v.66pnb4w7anxpqs79 \
	I0731 09:53:53.017610    2954 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 
	I0731 09:53:53.018224    2954 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0731 09:53:53.018239    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:53.018245    2954 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0731 09:53:53.040097    2954 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0731 09:53:53.097376    2954 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0731 09:53:53.101992    2954 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.3/kubectl ...
	I0731 09:53:53.102004    2954 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0731 09:53:53.115926    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0731 09:53:53.335699    2954 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0731 09:53:53.335768    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:53.335769    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000 minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=true
	I0731 09:53:53.489955    2954 ops.go:34] apiserver oom_adj: -16
	I0731 09:53:53.490022    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:53.990085    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:54.490335    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:54.991422    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:55.490608    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:55.990200    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:56.490175    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:56.990807    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:57.491373    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:57.991164    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:58.491587    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:58.990197    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:59.490119    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:59.990444    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:00.490776    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:00.990123    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:01.490685    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:01.991905    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:02.490505    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:02.990148    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:03.490590    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:03.990745    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:04.491071    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:04.991117    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:05.490027    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:05.576301    2954 kubeadm.go:1113] duration metric: took 12.240698872s to wait for elevateKubeSystemPrivileges
	I0731 09:54:05.576324    2954 kubeadm.go:394] duration metric: took 23.908471214s to StartCluster
	I0731 09:54:05.576346    2954 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:05.576441    2954 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:54:05.576993    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:05.577274    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0731 09:54:05.577286    2954 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:05.577302    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:54:05.577319    2954 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 09:54:05.577357    2954 addons.go:69] Setting storage-provisioner=true in profile "ha-393000"
	I0731 09:54:05.577363    2954 addons.go:69] Setting default-storageclass=true in profile "ha-393000"
	I0731 09:54:05.577386    2954 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-393000"
	I0731 09:54:05.577386    2954 addons.go:234] Setting addon storage-provisioner=true in "ha-393000"
	I0731 09:54:05.577408    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:05.577423    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:05.577661    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.577669    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.577675    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.577679    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.587150    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51060
	I0731 09:54:05.587233    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51061
	I0731 09:54:05.587573    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.587584    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.587918    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.587919    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.587930    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.587931    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.588210    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.588232    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.588358    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.588454    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.588531    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.588614    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.588639    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.590714    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:54:05.590994    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 09:54:05.591385    2954 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 09:54:05.591537    2954 addons.go:234] Setting addon default-storageclass=true in "ha-393000"
	I0731 09:54:05.591560    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:05.591783    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.591798    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.597469    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51064
	I0731 09:54:05.597830    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.598161    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.598171    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.598405    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.598520    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.598612    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.598688    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.599681    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:05.600339    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51066
	I0731 09:54:05.600677    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.601035    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.601051    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.601254    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.601611    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.601637    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.610207    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51068
	I0731 09:54:05.610548    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.610892    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.610909    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.611149    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.611266    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.611351    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.611421    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.612421    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:05.612552    2954 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0731 09:54:05.612560    2954 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0731 09:54:05.612568    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:05.612695    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:05.612786    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:05.612891    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:05.612974    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:05.623428    2954 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0731 09:54:05.644440    2954 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0731 09:54:05.644452    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0731 09:54:05.644468    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:05.644630    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:05.644723    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:05.644822    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:05.644921    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:05.653382    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.169.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0731 09:54:05.687318    2954 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0731 09:54:05.764200    2954 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0731 09:54:06.182319    2954 start.go:971] {"host.minikube.internal": 192.169.0.1} host record injected into CoreDNS's ConfigMap
	I0731 09:54:06.182364    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.182377    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.182560    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.182561    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.182572    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.182582    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.182588    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.182708    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.182715    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.182734    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.182830    2954 round_trippers.go:463] GET https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0731 09:54:06.182842    2954 round_trippers.go:469] Request Headers:
	I0731 09:54:06.182849    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:54:06.182854    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:54:06.189976    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:54:06.190422    2954 round_trippers.go:463] PUT https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0731 09:54:06.190430    2954 round_trippers.go:469] Request Headers:
	I0731 09:54:06.190435    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:54:06.190439    2954 round_trippers.go:473]     Content-Type: application/json
	I0731 09:54:06.190441    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:54:06.192143    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:54:06.192277    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.192285    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.192466    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.192478    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.192482    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318368    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.318380    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.318552    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.318557    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318564    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.318573    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.318591    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.318752    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318752    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.318769    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.354999    2954 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
	I0731 09:54:06.412621    2954 addons.go:510] duration metric: took 835.314471ms for enable addons: enabled=[default-storageclass storage-provisioner]
	I0731 09:54:06.412653    2954 start.go:246] waiting for cluster config update ...
	I0731 09:54:06.412665    2954 start.go:255] writing updated cluster config ...
	I0731 09:54:06.449784    2954 out.go:177] 
	I0731 09:54:06.487284    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:06.487391    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:06.509688    2954 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 09:54:06.585678    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:54:06.585712    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:54:06.585911    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:54:06.585931    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:54:06.586023    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:06.586742    2954 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:54:06.586867    2954 start.go:364] duration metric: took 101.68µs to acquireMachinesLock for "ha-393000-m02"
	I0731 09:54:06.586897    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:06.586986    2954 start.go:125] createHost starting for "m02" (driver="hyperkit")
	I0731 09:54:06.608709    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:54:06.608788    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:06.608805    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:06.617299    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51073
	I0731 09:54:06.617638    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:06.618011    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:06.618029    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:06.618237    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:06.618326    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:06.618405    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:06.618514    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:54:06.618528    2954 client.go:168] LocalClient.Create starting
	I0731 09:54:06.618559    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:54:06.618609    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:54:06.618620    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:54:06.618668    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:54:06.618707    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:54:06.618717    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:54:06.618731    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:54:06.618737    2954 main.go:141] libmachine: (ha-393000-m02) Calling .PreCreateCheck
	I0731 09:54:06.618808    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:06.618841    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:06.646223    2954 main.go:141] libmachine: Creating machine...
	I0731 09:54:06.646236    2954 main.go:141] libmachine: (ha-393000-m02) Calling .Create
	I0731 09:54:06.646361    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:06.646520    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.646351    2979 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:54:06.646597    2954 main.go:141] libmachine: (ha-393000-m02) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:54:06.831715    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.831641    2979 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa...
	I0731 09:54:06.939142    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.939044    2979 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk...
	I0731 09:54:06.939162    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Writing magic tar header
	I0731 09:54:06.939170    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Writing SSH key tar header
	I0731 09:54:06.940042    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.939949    2979 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02 ...
	I0731 09:54:07.311809    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:07.311824    2954 main.go:141] libmachine: (ha-393000-m02) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 09:54:07.311866    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 09:54:07.337818    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 09:54:07.337835    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:54:07.337884    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:54:07.337912    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:54:07.337954    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:54:07.337986    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:54:07.338000    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:54:07.340860    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Pid is 2980
	I0731 09:54:07.341360    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 09:54:07.341374    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:07.341426    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:07.342343    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:07.342405    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:07.342418    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:07.342433    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:07.342443    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:07.342451    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:07.348297    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:54:07.357913    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:54:07.358688    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:54:07.358712    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:54:07.358723    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:54:07.358740    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:54:07.743017    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:54:07.743035    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:54:07.858034    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:54:07.858062    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:54:07.858072    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:54:07.858084    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:54:07.858884    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:54:07.858896    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:54:09.343775    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 1
	I0731 09:54:09.343792    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:09.343900    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:09.344720    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:09.344781    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:09.344792    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:09.344804    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:09.344817    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:09.344826    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:11.346829    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 2
	I0731 09:54:11.346846    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:11.346940    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:11.347752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:11.347766    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:11.347784    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:11.347795    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:11.347819    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:11.347832    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:13.348981    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 3
	I0731 09:54:13.349001    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:13.349109    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:13.349907    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:13.349943    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:13.349954    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:13.349965    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:13.349972    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:13.349980    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:13.459282    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 09:54:13.459342    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 09:54:13.459355    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 09:54:13.483197    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 09:54:15.351752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 4
	I0731 09:54:15.351769    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:15.351820    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:15.352675    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:15.352721    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:15.352735    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:15.352744    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:15.352752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:15.352760    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:17.353423    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 5
	I0731 09:54:17.353439    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:17.353530    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:17.354334    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:17.354363    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:54:17.354369    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:54:17.354392    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 09:54:17.354398    2954 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 09:54:17.354469    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:17.355226    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:17.355356    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:17.355457    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:54:17.355466    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:54:17.355564    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:17.355626    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:17.356407    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:54:17.356415    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:54:17.356426    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:54:17.356432    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:17.356529    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:17.356628    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:17.356727    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:17.356823    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:17.356939    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:17.357111    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:17.357118    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:54:18.376907    2954 main.go:141] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I0731 09:54:21.440008    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:54:21.440021    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:54:21.440026    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.440157    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.440265    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.440360    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.440445    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.440567    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.440720    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.440728    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:54:21.502840    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:54:21.502894    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:54:21.502900    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:54:21.502905    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.503041    2954 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 09:54:21.503052    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.503150    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.503242    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.503322    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.503392    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.503473    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.503584    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.503728    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.503737    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 09:54:21.579730    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 09:54:21.579745    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.579874    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.579976    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.580070    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.580163    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.580287    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.580427    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.580439    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:54:21.651021    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:54:21.651038    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:54:21.651048    2954 buildroot.go:174] setting up certificates
	I0731 09:54:21.651054    2954 provision.go:84] configureAuth start
	I0731 09:54:21.651061    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.651192    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:21.651290    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.651382    2954 provision.go:143] copyHostCerts
	I0731 09:54:21.651408    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:54:21.651454    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:54:21.651459    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:54:21.651611    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:54:21.651812    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:54:21.651848    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:54:21.651853    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:54:21.651933    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:54:21.652069    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:54:21.652109    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:54:21.652114    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:54:21.652196    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:54:21.652337    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 09:54:21.695144    2954 provision.go:177] copyRemoteCerts
	I0731 09:54:21.695204    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:54:21.695225    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.695363    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.695457    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.695544    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.695616    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:21.734262    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:54:21.734338    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 09:54:21.760893    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:54:21.760979    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 09:54:21.787062    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:54:21.787131    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:54:21.807971    2954 provision.go:87] duration metric: took 156.910143ms to configureAuth
	I0731 09:54:21.807985    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:54:21.808123    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:21.808137    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:21.808270    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.808350    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.808427    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.808504    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.808592    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.808693    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.808822    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.808830    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:54:21.871923    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:54:21.871936    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:54:21.872014    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:54:21.872025    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.872159    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.872242    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.872339    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.872432    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.872558    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.872693    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.872741    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:54:21.947253    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:54:21.947272    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.947434    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.947533    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.947607    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.947689    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.947845    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.947990    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.948005    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:54:23.521299    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 09:54:23.521320    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:54:23.521327    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetURL
	I0731 09:54:23.521467    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:54:23.521475    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:54:23.521480    2954 client.go:171] duration metric: took 16.903099578s to LocalClient.Create
	I0731 09:54:23.521492    2954 start.go:167] duration metric: took 16.903132869s to libmachine.API.Create "ha-393000"
	I0731 09:54:23.521498    2954 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 09:54:23.521504    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:54:23.521519    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.521663    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:54:23.521677    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.521769    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.521859    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.521933    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.522032    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:23.560604    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:54:23.563782    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:54:23.563793    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:54:23.563892    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:54:23.564080    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:54:23.564086    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:54:23.564293    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:54:23.571517    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:54:23.591429    2954 start.go:296] duration metric: took 69.922656ms for postStartSetup
	I0731 09:54:23.591460    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:23.592068    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:23.592212    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:23.592596    2954 start.go:128] duration metric: took 17.005735325s to createHost
	I0731 09:54:23.592609    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.592713    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.592826    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.592928    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.593022    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.593148    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:23.593279    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:23.593287    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:54:23.656618    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444863.810880618
	
	I0731 09:54:23.656630    2954 fix.go:216] guest clock: 1722444863.810880618
	I0731 09:54:23.656635    2954 fix.go:229] Guest: 2024-07-31 09:54:23.810880618 -0700 PDT Remote: 2024-07-31 09:54:23.592602 -0700 PDT m=+67.492982270 (delta=218.278618ms)
	I0731 09:54:23.656654    2954 fix.go:200] guest clock delta is within tolerance: 218.278618ms
	I0731 09:54:23.656663    2954 start.go:83] releasing machines lock for "ha-393000-m02", held for 17.069938552s
	I0731 09:54:23.656681    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.656811    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:23.684522    2954 out.go:177] * Found network options:
	I0731 09:54:23.836571    2954 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 09:54:23.866932    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:54:23.866975    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.867861    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.868089    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.868209    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:54:23.868288    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 09:54:23.868332    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:54:23.868439    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 09:54:23.868462    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.868525    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.868708    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.868756    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.868922    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.868944    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.869058    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:23.869081    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.869206    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 09:54:23.904135    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:54:23.904205    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:54:23.927324    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:54:23.927338    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:54:23.927400    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:54:23.970222    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:54:23.978777    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:54:23.987481    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:54:23.987533    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:54:23.996430    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:54:24.004692    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:54:24.012968    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:54:24.021204    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:54:24.030482    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:54:24.038802    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:54:24.047006    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:54:24.055781    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:54:24.063050    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:54:24.072089    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:24.169406    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:54:24.189452    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:54:24.189519    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:54:24.202393    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:54:24.214821    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:54:24.229583    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:54:24.240171    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:54:24.250428    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:54:24.302946    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:54:24.313120    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:54:24.327912    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:54:24.331673    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:54:24.338902    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:54:24.352339    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:54:24.449032    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:54:24.557842    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:54:24.557870    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:54:24.571700    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:24.673137    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:54:27.047079    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.373944592s)
	I0731 09:54:27.047137    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:54:27.057410    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:54:27.071816    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:54:27.082278    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:54:27.176448    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:54:27.277016    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:27.384870    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:54:27.398860    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:54:27.409735    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:27.507837    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:54:27.568313    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:54:27.568381    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:54:27.573262    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:54:27.573320    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:54:27.579109    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:54:27.606116    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:54:27.606208    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:54:27.625621    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:54:27.663443    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:54:27.704938    2954 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 09:54:27.726212    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:27.726560    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:54:27.730336    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:54:27.740553    2954 mustload.go:65] Loading cluster: ha-393000
	I0731 09:54:27.740700    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:27.740921    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:27.740943    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:27.749667    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51097
	I0731 09:54:27.750028    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:27.750384    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:27.750401    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:27.750596    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:27.750732    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:27.750813    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:27.750888    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:27.751853    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:27.752094    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:27.752117    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:27.760565    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51099
	I0731 09:54:27.760882    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:27.761210    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:27.761223    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:27.761435    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:27.761551    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:27.761648    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.6
	I0731 09:54:27.761653    2954 certs.go:194] generating shared ca certs ...
	I0731 09:54:27.761672    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.761836    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:54:27.761936    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:54:27.761945    2954 certs.go:256] generating profile certs ...
	I0731 09:54:27.762034    2954 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:54:27.762058    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069
	I0731 09:54:27.762073    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0731 09:54:27.834156    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 ...
	I0731 09:54:27.834169    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069: {Name:mk0062f228b9fa8374eba60d674a49cb0265b988 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.834495    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069 ...
	I0731 09:54:27.834504    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069: {Name:mkd62a5cca652a59908630fd95f20d2e01386237 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.834713    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:54:27.834929    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:54:27.835197    2954 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:54:27.835206    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:54:27.835229    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:54:27.835247    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:54:27.835267    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:54:27.835284    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:54:27.835302    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:54:27.835321    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:54:27.835338    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:54:27.835425    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:54:27.835473    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:54:27.835481    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:54:27.835511    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:54:27.835539    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:54:27.835575    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:54:27.835647    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:54:27.835682    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:54:27.835703    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:54:27.835723    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:27.835762    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:27.835910    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:27.836005    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:27.836102    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:27.836203    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:27.868754    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0731 09:54:27.872390    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 09:54:27.881305    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0731 09:54:27.884697    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 09:54:27.893772    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 09:54:27.896980    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 09:54:27.905593    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0731 09:54:27.908812    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 09:54:27.916605    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0731 09:54:27.919921    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 09:54:27.927985    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0731 09:54:27.931223    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
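The alternating `stat`/`scp` lines above follow ssh_runner's copy-if-missing pattern: probe the remote path first and transfer only when the stat fails. A minimal local sketch (paths are illustrative, and `cp` stands in for the SSH copy):

```shell
# copy-if-missing: mirrors ssh_runner's stat-then-scp flow (cp stands in for scp)
src=$(mktemp); dst_dir=$(mktemp -d); dst="$dst_dir/sa.pub"
printf 'key material\n' > "$src"
if ! stat "$dst" >/dev/null 2>&1; then   # destination absent -> transfer
  cp "$src" "$dst"
fi
stat "$dst" >/dev/null 2>&1 && echo transferred
```

On a second run the `stat` succeeds and the copy is skipped, which is why already-provisioned certs only produce a single `stat` line in the log.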
	I0731 09:54:27.940238    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:54:27.960044    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:54:27.980032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:54:27.999204    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:54:28.018549    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0731 09:54:28.037848    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 09:54:28.057376    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:54:28.076776    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:54:28.096215    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:54:28.115885    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:54:28.135490    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:54:28.154907    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 09:54:28.169275    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 09:54:28.183001    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 09:54:28.196610    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 09:54:28.210320    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 09:54:28.223811    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 09:54:28.237999    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 09:54:28.251767    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:54:28.256201    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:54:28.265361    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.268834    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.268882    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.273194    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:54:28.282819    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:54:28.292122    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.295585    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.295622    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.299894    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:54:28.308965    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:54:28.318848    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.322347    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.322383    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.326657    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
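The `openssl x509 -hash -noout` runs above compute each certificate's OpenSSL subject hash, and the following `test -L || ln -fs` commands install the cert under `/etc/ssl/certs/<hash>.0` so OpenSSL's lookup-by-hash can find it. A sketch of that linking step against scratch directories (the hash value is copied from the log, not recomputed):

```shell
certs=$(mktemp -d); ssl=$(mktemp -d)
printf 'PEM data\n' > "$certs/minikubeCA.pem"
# in the real flow: hash=$(openssl x509 -hash -noout -in "$certs/minikubeCA.pem")
hash=b5213941   # value taken from the log for illustration
ln -fs "$certs/minikubeCA.pem" "$ssl/$hash.0"   # same pattern as the test -L || ln -fs line
[ -L "$ssl/$hash.0" ] && echo linked
```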
	I0731 09:54:28.335765    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:54:28.338885    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:54:28.338923    2954 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.30.3 docker true true} ...
	I0731 09:54:28.338981    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:54:28.338998    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:54:28.339031    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:54:28.352962    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:54:28.353010    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:54:28.353068    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:54:28.361447    2954 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 09:54:28.361501    2954 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 09:54:28.370031    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet
	I0731 09:54:28.370031    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm
	I0731 09:54:28.370036    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl
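The `?checksum=file:...sha256` query on each download URL tells minikube's downloader to fetch the published `.sha256` file and compare it against the digest of the downloaded binary. An offline sketch of that comparison (file contents and names are illustrative; the expected digest really comes from the sidecar `.sha256` file):

```shell
bin=$(mktemp)
printf 'fake kubelet bytes' > "$bin"
want=$(sha256sum "$bin" | cut -d' ' -f1)   # in the real flow, read from kubelet.sha256
got=$(sha256sum "$bin" | cut -d' ' -f1)    # digest of the downloaded artifact
[ "$got" = "$want" ] && echo checksum-ok || echo checksum-mismatch
```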
	I0731 09:54:31.406224    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:54:31.406308    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:54:31.409804    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 09:54:31.409825    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 09:54:32.215163    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:54:32.215265    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:54:32.218832    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 09:54:32.218858    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 09:54:39.678084    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:54:39.690174    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:54:39.690295    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:54:39.693595    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 09:54:39.693614    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 09:54:39.964594    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 09:54:39.972786    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 09:54:39.986436    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:54:39.999856    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 09:54:40.013590    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:54:40.016608    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
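The one-liner at 09:54:40.016 rewrites `/etc/hosts` idempotently: filter out any stale `control-plane.minikube.internal` entry, append the current VIP, and copy the temp file back over the original. The same pattern against a scratch file instead of the real `/etc/hosts` (the log's version uses `sudo cp` for the final replace):

```shell
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.169.0.99\tcontrol-plane.minikube.internal\n' > "$hosts"
# drop stale entries, append the current VIP, then swap the file into place
{ grep -v $'\tcontrol-plane.minikube.internal$' "$hosts"
  printf '192.169.0.254\tcontrol-plane.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"
grep -c 'control-plane.minikube.internal' "$hosts"   # prints 1: exactly one entry survives
```

Running it repeatedly leaves exactly one entry, which is what makes re-running `minikube start` safe.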
	I0731 09:54:40.026617    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:40.125738    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:54:40.142197    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:40.142482    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:40.142512    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:40.151352    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51126
	I0731 09:54:40.151710    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:40.152074    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:40.152091    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:40.152318    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:40.152428    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:40.152528    2954 start.go:317] joinCluster: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 Clu
sterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpira
tion:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:54:40.152603    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0731 09:54:40.152616    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:40.152722    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:40.152805    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:40.152933    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:40.153036    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:40.232831    2954 start.go:343] trying to join control-plane node "m02" to cluster: &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:40.232861    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token heh6bo.7n85cftszx0hevpy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443"
	I0731 09:55:07.963279    2954 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token heh6bo.7n85cftszx0hevpy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443": (27.730638671s)
	I0731 09:55:07.963316    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0731 09:55:08.368958    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000-m02 minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=false
	I0731 09:55:08.452570    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-393000-m02 node-role.kubernetes.io/control-plane:NoSchedule-
	I0731 09:55:08.540019    2954 start.go:319] duration metric: took 28.387749448s to joinCluster
	I0731 09:55:08.540065    2954 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:08.540296    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:08.563232    2954 out.go:177] * Verifying Kubernetes components...
	I0731 09:55:08.603726    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:08.841318    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:55:08.872308    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:55:08.872512    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 09:55:08.872555    2954 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 09:55:08.872732    2954 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m02" to be "Ready" ...
	I0731 09:55:08.872795    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:08.872800    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:08.872806    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:08.872810    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:08.881842    2954 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0731 09:55:09.372875    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:09.372888    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:09.372894    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:09.372897    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:09.374975    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:09.872917    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:09.872929    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:09.872935    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:09.872939    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:09.875869    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.372943    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:10.372956    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:10.372964    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:10.372967    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:10.375041    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.874945    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:10.875020    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:10.875035    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:10.875043    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:10.877858    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.878307    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
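The repeating `GET .../api/v1/nodes/ha-393000-m02` requests above are `node_ready`'s poll loop: re-query the API server roughly every 500ms and log the status until the node's Ready condition flips or the 6m budget expires. The shape of that loop, with the API check replaced by a stand-in probe:

```shell
# poll-until-ready with an attempt cap; node_ready stands in for the GET + Ready-condition check
attempts=0; max=720
node_ready() { [ "$attempts" -ge 3 ]; }   # illustrative: flips true on the 3rd poll
until node_ready; do
  attempts=$((attempts + 1))
  [ "$attempts" -ge "$max" ] && break     # the real loop enforces a 6m deadline
  sleep 0                                 # the real interval is ~500ms
done
echo "ready after $attempts polls"
```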
	I0731 09:55:11.373440    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:11.373461    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:11.373468    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:11.373472    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:11.376182    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:11.874612    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:11.874624    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:11.874630    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:11.874634    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:11.876432    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:12.374085    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:12.374098    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:12.374104    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:12.374107    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:12.376039    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:12.874234    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:12.874246    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:12.874252    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:12.874255    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:12.876210    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:13.374284    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:13.374372    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:13.374387    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:13.374396    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:13.377959    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:13.378403    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:13.873814    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:13.873839    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:13.873850    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:13.873856    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:13.876640    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:14.373497    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:14.373550    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:14.373561    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:14.373570    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:14.376681    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:14.872976    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:14.873065    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:14.873079    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:14.873087    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:14.875607    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:15.373684    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:15.373702    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:15.373711    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:15.373716    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:15.375839    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:15.873002    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:15.873028    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:15.873040    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:15.873049    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:15.876311    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:15.877408    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:16.373017    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:16.373044    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:16.373110    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:16.373119    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:16.376651    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:16.873932    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:16.873951    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:16.873958    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:16.873961    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:16.875945    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:17.372883    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:17.372963    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:17.372979    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:17.372987    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:17.375706    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:17.874312    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:17.874334    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:17.874343    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:17.874381    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:17.876575    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:18.374077    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:18.374176    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:18.374191    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:18.374197    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:18.377131    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:18.377505    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:18.874567    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:18.874589    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:18.874653    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:18.874658    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:18.877221    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:19.373331    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:19.373347    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:19.373387    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:19.373392    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:19.375412    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:19.873283    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:19.873307    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:19.873320    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:19.873326    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:19.876694    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.373050    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:20.373075    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:20.373086    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:20.373096    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:20.376371    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.874379    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:20.874402    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:20.874414    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:20.874421    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:20.877609    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.878167    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:21.373483    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:21.373509    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:21.373520    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:21.373526    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:21.376649    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:21.872794    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:21.872825    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:21.872832    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:21.872837    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:21.874864    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:22.373703    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:22.373721    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:22.373733    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:22.373739    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:22.376275    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:22.872731    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:22.872746    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:22.872752    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:22.872756    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:22.875078    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:23.373989    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:23.374007    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:23.374017    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:23.374021    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:23.376252    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:23.376876    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:23.874071    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:23.874095    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:23.874118    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:23.874128    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:23.877415    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:24.373797    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:24.373828    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:24.373836    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:24.373842    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:24.375723    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:24.873198    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:24.873217    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:24.873239    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:24.873242    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:24.874997    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:25.373864    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:25.373964    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:25.373983    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:25.373993    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:25.376940    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:25.377783    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:25.873066    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:25.873140    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:25.873157    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:25.873167    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:25.876035    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:26.373560    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:26.373582    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:26.373594    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:26.373600    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:26.376763    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:26.872802    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:26.872826    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:26.872847    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:26.872855    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:26.875665    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.372793    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.372848    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.372859    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.372865    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.375283    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.872817    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.872887    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.872897    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.872902    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.875143    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.875477    2954 node_ready.go:49] node "ha-393000-m02" has status "Ready":"True"
	I0731 09:55:27.875491    2954 node_ready.go:38] duration metric: took 19.002910931s for node "ha-393000-m02" to be "Ready" ...
	I0731 09:55:27.875498    2954 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:55:27.875539    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:27.875545    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.875550    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.875554    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.884028    2954 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0731 09:55:27.888275    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.888338    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 09:55:27.888344    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.888351    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.888354    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.895154    2954 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 09:55:27.895668    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.895676    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.895682    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.895685    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.903221    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:55:27.903585    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.903594    2954 pod_ready.go:81] duration metric: took 15.30431ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.903601    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.903644    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 09:55:27.903649    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.903655    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.903659    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.910903    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:55:27.911272    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.911279    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.911284    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.911287    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.912846    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.913176    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.913184    2954 pod_ready.go:81] duration metric: took 9.57768ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.913191    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.913223    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 09:55:27.913228    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.913233    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.913237    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.914947    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.915374    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.915380    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.915386    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.915390    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.916800    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.917134    2954 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.917142    2954 pod_ready.go:81] duration metric: took 3.945951ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.917148    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.917182    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 09:55:27.917186    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.917192    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.917199    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.919108    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.919519    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.919526    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.919532    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.919538    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.920909    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.921212    2954 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.921221    2954 pod_ready.go:81] duration metric: took 4.068426ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.921231    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.073440    2954 request.go:629] Waited for 152.136555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:55:28.073539    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:55:28.073547    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.073555    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.073561    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.075944    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:28.272878    2954 request.go:629] Waited for 196.473522ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:28.272966    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:28.272972    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.272978    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.272981    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.274914    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:28.275308    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:28.275318    2954 pod_ready.go:81] duration metric: took 354.084518ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.275325    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.473409    2954 request.go:629] Waited for 198.051207ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:55:28.473441    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:55:28.473447    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.473463    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.473467    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.475323    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:28.673703    2954 request.go:629] Waited for 197.835098ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:28.673754    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:28.673765    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.673772    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.673777    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.676049    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:28.676485    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:28.676497    2954 pod_ready.go:81] duration metric: took 401.169334ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.676504    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.874899    2954 request.go:629] Waited for 198.343236ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:55:28.875005    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:55:28.875014    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.875025    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.875031    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.878371    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:29.072894    2954 request.go:629] Waited for 193.894527ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:29.072997    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:29.073009    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.073020    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.073029    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.075911    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.076354    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.076367    2954 pod_ready.go:81] duration metric: took 399.859987ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.076376    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.273708    2954 request.go:629] Waited for 197.294345ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:55:29.273758    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:55:29.273806    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.273815    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.273819    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.276500    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.473244    2954 request.go:629] Waited for 196.211404ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.473347    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.473355    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.473363    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.473367    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.475855    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.476256    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.476266    2954 pod_ready.go:81] duration metric: took 399.888458ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.476273    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.672987    2954 request.go:629] Waited for 196.670765ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:55:29.673094    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:55:29.673114    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.673128    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.673135    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.676240    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:29.874264    2954 request.go:629] Waited for 197.423472ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.874348    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.874352    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.874365    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.874369    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.876229    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:29.876542    2954 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.876551    2954 pod_ready.go:81] duration metric: took 400.273525ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.876557    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.073985    2954 request.go:629] Waited for 197.386483ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:55:30.074064    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:55:30.074071    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.074076    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.074080    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.075934    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:30.274353    2954 request.go:629] Waited for 197.921759ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.274399    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.274408    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.274421    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.274429    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.276767    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:30.277075    2954 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:30.277085    2954 pod_ready.go:81] duration metric: took 400.525562ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.277092    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.474867    2954 request.go:629] Waited for 197.733458ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:55:30.474919    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:55:30.474936    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.474949    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.474958    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.478180    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:30.673620    2954 request.go:629] Waited for 194.924994ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.673658    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.673662    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.673668    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.673674    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.675356    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:30.675625    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:30.675634    2954 pod_ready.go:81] duration metric: took 398.539654ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.675640    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.873712    2954 request.go:629] Waited for 198.03899ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:55:30.873795    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:55:30.873801    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.873807    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.873811    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.875750    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:31.074152    2954 request.go:629] Waited for 197.932145ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:31.074207    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:31.074215    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.074227    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.074234    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.077132    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:31.077723    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:31.077735    2954 pod_ready.go:81] duration metric: took 402.091925ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:31.077744    2954 pod_ready.go:38] duration metric: took 3.202266702s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:55:31.077770    2954 api_server.go:52] waiting for apiserver process to appear ...
	I0731 09:55:31.077872    2954 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:55:31.089706    2954 api_server.go:72] duration metric: took 22.549827849s to wait for apiserver process to appear ...
	I0731 09:55:31.089719    2954 api_server.go:88] waiting for apiserver healthz status ...
	I0731 09:55:31.089735    2954 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 09:55:31.093731    2954 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 09:55:31.093774    2954 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 09:55:31.093779    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.093785    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.093789    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.094287    2954 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 09:55:31.094337    2954 api_server.go:141] control plane version: v1.30.3
	I0731 09:55:31.094346    2954 api_server.go:131] duration metric: took 4.622445ms to wait for apiserver health ...
	I0731 09:55:31.094351    2954 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 09:55:31.272834    2954 request.go:629] Waited for 178.447514ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.272864    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.272868    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.272874    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.272879    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.275929    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:31.278922    2954 system_pods.go:59] 17 kube-system pods found
	I0731 09:55:31.278939    2954 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:55:31.278943    2954 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:55:31.278948    2954 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:55:31.278951    2954 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:55:31.278954    2954 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:55:31.278957    2954 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:55:31.278960    2954 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:55:31.278963    2954 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:55:31.278966    2954 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:55:31.278968    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:55:31.278971    2954 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:55:31.278973    2954 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:55:31.278976    2954 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:55:31.278982    2954 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:55:31.278986    2954 system_pods.go:61] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:55:31.278988    2954 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:55:31.278991    2954 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:55:31.278996    2954 system_pods.go:74] duration metric: took 184.642078ms to wait for pod list to return data ...
	I0731 09:55:31.279002    2954 default_sa.go:34] waiting for default service account to be created ...
	I0731 09:55:31.473455    2954 request.go:629] Waited for 194.413647ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:55:31.473487    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:55:31.473492    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.473498    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.473502    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.475460    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:31.475608    2954 default_sa.go:45] found service account: "default"
	I0731 09:55:31.475618    2954 default_sa.go:55] duration metric: took 196.612794ms for default service account to be created ...
	I0731 09:55:31.475624    2954 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 09:55:31.673326    2954 request.go:629] Waited for 197.663631ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.673362    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.673369    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.673377    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.673384    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.676582    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:31.680023    2954 system_pods.go:86] 17 kube-system pods found
	I0731 09:55:31.680035    2954 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:55:31.680039    2954 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:55:31.680042    2954 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:55:31.680045    2954 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:55:31.680048    2954 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:55:31.680051    2954 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:55:31.680054    2954 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:55:31.680057    2954 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:55:31.680060    2954 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:55:31.680063    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:55:31.680067    2954 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:55:31.680070    2954 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:55:31.680073    2954 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:55:31.680076    2954 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:55:31.680079    2954 system_pods.go:89] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:55:31.680082    2954 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:55:31.680085    2954 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:55:31.680089    2954 system_pods.go:126] duration metric: took 204.462284ms to wait for k8s-apps to be running ...
	I0731 09:55:31.680093    2954 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 09:55:31.680137    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:55:31.691384    2954 system_svc.go:56] duration metric: took 11.279108ms WaitForService to wait for kubelet
	I0731 09:55:31.691399    2954 kubeadm.go:582] duration metric: took 23.151526974s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:55:31.691411    2954 node_conditions.go:102] verifying NodePressure condition ...
	I0731 09:55:31.872842    2954 request.go:629] Waited for 181.393446ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 09:55:31.872873    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 09:55:31.872877    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.872884    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.872887    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.875560    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:31.876076    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:55:31.876090    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:55:31.876101    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:55:31.876111    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:55:31.876115    2954 node_conditions.go:105] duration metric: took 184.70211ms to run NodePressure ...
	I0731 09:55:31.876123    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:55:31.876138    2954 start.go:255] writing updated cluster config ...
	I0731 09:55:31.896708    2954 out.go:177] 
	I0731 09:55:31.917824    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:31.917916    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:31.939502    2954 out.go:177] * Starting "ha-393000-m03" control-plane node in "ha-393000" cluster
	I0731 09:55:31.981501    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:55:31.981523    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:55:31.981705    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:55:31.981717    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:55:31.981841    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:31.982574    2954 start.go:360] acquireMachinesLock for ha-393000-m03: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:55:31.982642    2954 start.go:364] duration metric: took 52.194µs to acquireMachinesLock for "ha-393000-m03"
	I0731 09:55:31.982663    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ing
ress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror:
DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m03 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:31.982776    2954 start.go:125] createHost starting for "m03" (driver="hyperkit")
	I0731 09:55:32.003523    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:55:32.003599    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:32.003626    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:32.012279    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51131
	I0731 09:55:32.012622    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:32.012991    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:32.013008    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:32.013225    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:32.013332    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:32.013417    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:32.013511    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:55:32.013531    2954 client.go:168] LocalClient.Create starting
	I0731 09:55:32.013562    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:55:32.013605    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:55:32.013616    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:55:32.013658    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:55:32.013685    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:55:32.013695    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:55:32.013708    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:55:32.013722    2954 main.go:141] libmachine: (ha-393000-m03) Calling .PreCreateCheck
	I0731 09:55:32.013796    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:32.013821    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:32.024803    2954 main.go:141] libmachine: Creating machine...
	I0731 09:55:32.024819    2954 main.go:141] libmachine: (ha-393000-m03) Calling .Create
	I0731 09:55:32.024954    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:32.025189    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.024948    2993 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:55:32.025311    2954 main.go:141] libmachine: (ha-393000-m03) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:55:32.387382    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.387300    2993 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa...
	I0731 09:55:32.468181    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.468125    2993 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk...
	I0731 09:55:32.468207    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Writing magic tar header
	I0731 09:55:32.468229    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Writing SSH key tar header
	I0731 09:55:32.468792    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.468762    2993 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03 ...
	I0731 09:55:33.078663    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:33.078680    2954 main.go:141] libmachine: (ha-393000-m03) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid
	I0731 09:55:33.078716    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Using UUID 451d6bef-97a7-42a6-8ccb-b8851dda0594
	I0731 09:55:33.103258    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Generated MAC 3e:56:a2:18:e2:4c
	I0731 09:55:33.103280    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:55:33.103347    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:55:33.103394    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:55:33.103443    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "451d6bef-97a7-42a6-8ccb-b8851dda0594", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machine
s/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:55:33.103490    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 451d6bef-97a7-42a6-8ccb-b8851dda0594 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=t
tyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:55:33.103507    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:55:33.106351    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Pid is 2994
	I0731 09:55:33.106790    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 0
	I0731 09:55:33.106810    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:33.106894    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:33.107878    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:33.107923    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:33.107940    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:33.107959    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:33.107977    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:33.107995    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:33.108059    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:33.114040    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:55:33.122160    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:55:33.123003    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:55:33.123036    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:55:33.123053    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:55:33.123062    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:55:33.505461    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:55:33.505481    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:55:33.620173    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:55:33.620193    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:55:33.620213    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:55:33.620225    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:55:33.621055    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:55:33.621064    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:55:35.108561    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 1
	I0731 09:55:35.108578    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:35.108664    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:35.109476    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:35.109527    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:35.109535    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:35.109543    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:35.109553    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:35.109564    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:35.109588    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:37.111452    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 2
	I0731 09:55:37.111469    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:37.111534    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:37.112347    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:37.112387    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:37.112400    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:37.112409    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:37.112418    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:37.112431    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:37.112438    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:39.113861    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 3
	I0731 09:55:39.113876    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:39.113989    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:39.114793    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:39.114841    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:39.114854    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:39.114871    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:39.114881    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:39.114894    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:39.114910    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:39.197635    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 09:55:39.197744    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 09:55:39.197756    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 09:55:39.222062    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 09:55:41.116408    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 4
	I0731 09:55:41.116425    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:41.116529    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:41.117328    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:41.117368    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:41.117376    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:41.117399    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:41.117416    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:41.117425    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:41.117441    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:43.117722    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 5
	I0731 09:55:43.117737    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:43.117828    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:43.118651    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:43.118699    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0731 09:55:43.118714    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 09:55:43.118721    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found match: 3e:56:a2:18:e2:4c
	I0731 09:55:43.118726    2954 main.go:141] libmachine: (ha-393000-m03) DBG | IP: 192.169.0.7
	I0731 09:55:43.118795    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:43.119393    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:43.119491    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:43.119572    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:55:43.119580    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:55:43.119659    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:43.119724    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:43.120517    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:55:43.120525    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:55:43.120529    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:55:43.120540    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:43.120627    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:43.120733    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:43.120830    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:43.120937    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:43.121066    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:43.121248    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:43.121256    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:55:44.180872    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:55:44.180885    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:55:44.180891    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.181020    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.181119    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.181200    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.181293    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.181426    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.181579    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.181587    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:55:44.244214    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:55:44.244264    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:55:44.244271    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:55:44.244277    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.244401    2954 buildroot.go:166] provisioning hostname "ha-393000-m03"
	I0731 09:55:44.244413    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.244502    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.244591    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.244669    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.244754    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.244838    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.244957    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.245103    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.245112    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m03 && echo "ha-393000-m03" | sudo tee /etc/hostname
	I0731 09:55:44.315698    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m03
	
	I0731 09:55:44.315714    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.315853    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.315950    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.316034    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.316117    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.316237    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.316383    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.316394    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:55:44.383039    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:55:44.383055    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:55:44.383064    2954 buildroot.go:174] setting up certificates
	I0731 09:55:44.383071    2954 provision.go:84] configureAuth start
	I0731 09:55:44.383077    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.383215    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:44.383314    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.383408    2954 provision.go:143] copyHostCerts
	I0731 09:55:44.383435    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:55:44.383482    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:55:44.383490    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:55:44.383608    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:55:44.383821    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:55:44.383853    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:55:44.383859    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:55:44.383930    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:55:44.384107    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:55:44.384137    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:55:44.384146    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:55:44.384214    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:55:44.384364    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m03 san=[127.0.0.1 192.169.0.7 ha-393000-m03 localhost minikube]
	I0731 09:55:44.436199    2954 provision.go:177] copyRemoteCerts
	I0731 09:55:44.436250    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:55:44.436265    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.436405    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.436484    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.436578    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.436651    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:44.474166    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:55:44.474251    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:55:44.495026    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:55:44.495089    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 09:55:44.514528    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:55:44.514597    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 09:55:44.534382    2954 provision.go:87] duration metric: took 151.304295ms to configureAuth
	I0731 09:55:44.534397    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:55:44.534572    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:44.534587    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:44.534721    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.534815    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.534895    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.534982    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.535063    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.535176    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.535303    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.535311    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:55:44.595832    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:55:44.595845    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:55:44.595915    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:55:44.595926    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.596055    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.596141    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.596224    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.596312    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.596436    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.596585    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.596629    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:55:44.668428    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:55:44.668446    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.668587    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.668687    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.668775    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.668883    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.669009    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.669153    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.669165    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:55:46.245712    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 09:55:46.245728    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:55:46.245733    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetURL
	I0731 09:55:46.245877    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:55:46.245886    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:55:46.245891    2954 client.go:171] duration metric: took 14.176451747s to LocalClient.Create
	I0731 09:55:46.245904    2954 start.go:167] duration metric: took 14.176491485s to libmachine.API.Create "ha-393000"
	I0731 09:55:46.245910    2954 start.go:293] postStartSetup for "ha-393000-m03" (driver="hyperkit")
	I0731 09:55:46.245917    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:55:46.245936    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.246092    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:55:46.246107    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.246216    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.246326    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.246431    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.246511    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:46.290725    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:55:46.294553    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:55:46.294567    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:55:46.294659    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:55:46.294805    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:55:46.294812    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:55:46.294995    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:55:46.303032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:55:46.335630    2954 start.go:296] duration metric: took 89.711926ms for postStartSetup
	I0731 09:55:46.335676    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:46.336339    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:46.336499    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:46.336864    2954 start.go:128] duration metric: took 14.298177246s to createHost
	I0731 09:55:46.336879    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.336971    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.337062    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.337141    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.337213    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.337332    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:46.337451    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:46.337458    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:55:46.398217    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444946.512017695
	
	I0731 09:55:46.398229    2954 fix.go:216] guest clock: 1722444946.512017695
	I0731 09:55:46.398235    2954 fix.go:229] Guest: 2024-07-31 09:55:46.512017695 -0700 PDT Remote: 2024-07-31 09:55:46.336873 -0700 PDT m=+150.181968458 (delta=175.144695ms)
	I0731 09:55:46.398245    2954 fix.go:200] guest clock delta is within tolerance: 175.144695ms
	I0731 09:55:46.398250    2954 start.go:83] releasing machines lock for "ha-393000-m03", held for 14.359697621s
	I0731 09:55:46.398269    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.398407    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:46.418329    2954 out.go:177] * Found network options:
	I0731 09:55:46.439149    2954 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0731 09:55:46.477220    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 09:55:46.477241    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:55:46.477255    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.477897    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.478058    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.478150    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:55:46.478196    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	W0731 09:55:46.478232    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 09:55:46.478262    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:55:46.478353    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 09:55:46.478353    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.478369    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.478511    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.478558    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.478670    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.478731    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.478785    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:46.478828    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.478931    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	W0731 09:55:46.512520    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:55:46.512591    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:55:46.558288    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:55:46.558305    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:55:46.558391    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:55:46.574105    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:55:46.582997    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:55:46.591920    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:55:46.591969    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:55:46.600962    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:55:46.610057    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:55:46.619019    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:55:46.627876    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:55:46.637129    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:55:46.646079    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:55:46.655162    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:55:46.664198    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:55:46.672256    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:55:46.680371    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:46.778919    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:55:46.798064    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:55:46.798132    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:55:46.815390    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:55:46.827644    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:55:46.842559    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:55:46.853790    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:55:46.864444    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:55:46.887653    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:55:46.898070    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:55:46.913256    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:55:46.916263    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:55:46.923424    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:55:46.937344    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:55:47.035092    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:55:47.134788    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:55:47.134810    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:55:47.149022    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:47.247660    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:55:49.540717    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.293040269s)
	I0731 09:55:49.540778    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:55:49.551148    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:55:49.563946    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:55:49.574438    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:55:49.675905    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:55:49.777958    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:49.889335    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:55:49.903338    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:55:49.914450    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:50.020127    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:55:50.079269    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:55:50.079351    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:55:50.085411    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:55:50.085468    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:55:50.088527    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:55:50.115874    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:55:50.115947    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:55:50.133371    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:55:50.177817    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:55:50.199409    2954 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 09:55:50.242341    2954 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 09:55:50.263457    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:50.263780    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:55:50.267924    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:55:50.277257    2954 mustload.go:65] Loading cluster: ha-393000
	I0731 09:55:50.277434    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:50.277675    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:50.277699    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:50.286469    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51154
	I0731 09:55:50.286803    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:50.287152    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:50.287174    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:50.287405    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:50.287529    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:55:50.287619    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:50.287687    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:55:50.288682    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:55:50.288947    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:50.288976    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:50.297641    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51156
	I0731 09:55:50.297976    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:50.298336    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:50.298356    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:50.298557    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:50.298695    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:55:50.298796    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.7
	I0731 09:55:50.298803    2954 certs.go:194] generating shared ca certs ...
	I0731 09:55:50.298815    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.298953    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:55:50.299004    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:55:50.299013    2954 certs.go:256] generating profile certs ...
	I0731 09:55:50.299104    2954 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:55:50.299126    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb
	I0731 09:55:50.299146    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0731 09:55:50.438174    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb ...
	I0731 09:55:50.438189    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb: {Name:mk221449ac60933abd0b425ad947a6ab1580c0ce Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.438543    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb ...
	I0731 09:55:50.438553    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb: {Name:mk1cb7896668e4a7a9edaf8893989143a67a7948 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.438773    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:55:50.438957    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:55:50.439187    2954 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:55:50.439201    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:55:50.439224    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:55:50.439243    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:55:50.439262    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:55:50.439280    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:55:50.439299    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:55:50.439317    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:55:50.439334    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:55:50.439423    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:55:50.439459    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:55:50.439466    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:55:50.439503    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:55:50.439532    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:55:50.439561    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:55:50.439623    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:55:50.439662    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.439683    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.439702    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.439730    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:55:50.439869    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:55:50.439971    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:55:50.440060    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:55:50.440149    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:55:50.470145    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0731 09:55:50.473304    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 09:55:50.482843    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0731 09:55:50.486120    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 09:55:50.495117    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 09:55:50.498266    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 09:55:50.507788    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0731 09:55:50.510913    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 09:55:50.519933    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0731 09:55:50.523042    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 09:55:50.531891    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0731 09:55:50.535096    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 09:55:50.544058    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:55:50.564330    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:55:50.585250    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:55:50.605412    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:55:50.625492    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1444 bytes)
	I0731 09:55:50.645935    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 09:55:50.666578    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:55:50.686734    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:55:50.707428    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:55:50.728977    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:55:50.749365    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:55:50.769217    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 09:55:50.782635    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 09:55:50.796452    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 09:55:50.810265    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 09:55:50.823856    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 09:55:50.837713    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 09:55:50.851806    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 09:55:50.865643    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:55:50.869985    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:55:50.878755    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.882092    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.882127    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.886361    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 09:55:50.894800    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:55:50.903511    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.906902    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.906941    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.911184    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:55:50.919457    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:55:50.927999    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.931344    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.931398    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.935641    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:55:50.944150    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:55:50.947330    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:55:50.947373    2954 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.30.3 docker true true} ...
	I0731 09:55:50.947432    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:55:50.947450    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:55:50.947488    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:55:50.960195    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:55:50.960253    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:55:50.960307    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:55:50.968017    2954 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 09:55:50.968069    2954 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256
	I0731 09:55:50.975489    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256
	I0731 09:55:50.975509    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:55:50.975519    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:55:50.975557    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:55:50.976020    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:55:50.987294    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:55:50.987330    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 09:55:50.987350    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 09:55:50.987377    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 09:55:50.987399    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 09:55:50.987416    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:55:51.010057    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 09:55:51.010100    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
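The lines above follow a check-then-transfer pattern: stat the destination path, and only scp the cached binary on a miss. A self-contained sketch of that pattern (local `cp` stands in for the scp step, and the temp paths are illustrative, not minikube's real layout):

```shell
# Build a fake cached binary and a destination dir that does not exist yet.
src=$(mktemp)
printf 'fake-kubectl' > "$src"
dest=$(mktemp -d)/binaries/v1.30.3
# Existence check, as in the log (plain stat here for portability);
# copy only when the check fails.
if ! stat "$dest/kubectl" >/dev/null 2>&1; then
  mkdir -p "$dest"
  cp "$src" "$dest/kubectl"   # stands in for "scp cache -> VM"
fi
ls "$dest"
```

On a second run against the same `$dest` the stat succeeds and the copy is skipped, which is why the log only shows scp lines for binaries the VM does not yet have.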
	I0731 09:55:51.683575    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 09:55:51.690828    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 09:55:51.704403    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:55:51.718184    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 09:55:51.732058    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:55:51.735039    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
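The bash one-liner above makes the `/etc/hosts` update idempotent: strip any stale `control-plane.minikube.internal` entry, then append the current VIP. A sketch of the same rewrite against a temp file instead of the real `/etc/hosts` (bash, since the `$'\t'` quoting matches the log's command):

```shell
# Temp hosts file seeded with a stale control-plane entry (192.169.0.99 is
# an invented stale address for the demo).
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.169.0.99\tcontrol-plane.minikube.internal\n' > "$hosts"
# Drop any line ending in "<tab>control-plane.minikube.internal", then
# append the current VIP, exactly as the logged command does.
{ grep -v $'\tcontrol-plane.minikube.internal$' "$hosts"; \
  printf '192.169.0.254\tcontrol-plane.minikube.internal\n'; } > "$hosts.new" \
  && mv "$hosts.new" "$hosts"
grep control-plane "$hosts"
```

The preceding `grep 192.169.0.254 ... /etc/hosts` in the log is the fast path: if the exact entry already exists, the rewrite is skipped entirely.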
	I0731 09:55:51.744606    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:51.842284    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:55:51.858313    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:55:51.858589    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:51.858612    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:51.867825    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51159
	I0731 09:55:51.868326    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:51.868657    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:51.868668    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:51.868882    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:51.868991    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:55:51.869077    2954 start.go:317] joinCluster: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:55:51.869219    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0731 09:55:51.869241    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:55:51.869330    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:55:51.869408    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:55:51.869497    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:55:51.869579    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:55:51.957634    2954 start.go:343] trying to join control-plane node "m03" to cluster: &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:51.957691    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 3o7i0i.qey1hcj8w6i3nuyy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443"
	I0731 09:56:20.527748    2954 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 3o7i0i.qey1hcj8w6i3nuyy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443": (28.570050327s)
	I0731 09:56:20.527779    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0731 09:56:20.987700    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000-m03 minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=false
	I0731 09:56:21.064233    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-393000-m03 node-role.kubernetes.io/control-plane:NoSchedule-
	I0731 09:56:21.148165    2954 start.go:319] duration metric: took 29.279096383s to joinCluster
	I0731 09:56:21.148219    2954 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:56:21.148483    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:56:21.189791    2954 out.go:177] * Verifying Kubernetes components...
	I0731 09:56:21.248129    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:56:21.485219    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:56:21.507788    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:56:21.508040    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 09:56:21.508088    2954 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 09:56:21.508300    2954 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m03" to be "Ready" ...
	I0731 09:56:21.508342    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:21.508347    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:21.508353    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:21.508357    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:21.510586    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:22.008706    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:22.008723    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:22.008734    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:22.008738    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:22.010978    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:22.509350    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:22.509366    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:22.509372    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:22.509375    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:22.511656    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:23.009510    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:23.009526    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:23.009532    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:23.009535    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:23.011420    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:23.508500    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:23.508516    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:23.508523    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:23.508526    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:23.510720    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:23.511145    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:24.009377    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:24.009394    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:24.009439    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:24.009443    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:24.011828    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:24.509345    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:24.509361    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:24.509368    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:24.509372    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:24.511614    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:25.009402    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:25.009418    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:25.009424    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:25.009428    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:25.011344    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:25.508774    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:25.508790    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:25.508797    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:25.508800    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:25.510932    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:25.511292    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:26.008449    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:26.008465    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:26.008471    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:26.008474    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:26.010614    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:26.509754    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:26.509786    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:26.509799    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:26.509805    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:26.512347    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:27.008498    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:27.008592    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:27.008608    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:27.008615    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:27.011956    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:27.509028    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:27.509110    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:27.509125    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:27.509132    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:27.512133    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:27.512700    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:28.008990    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:28.009083    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:28.009097    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:28.009103    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:28.012126    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:28.509594    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:28.509612    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:28.509621    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:28.509625    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:28.512206    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:29.009613    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:29.009628    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:29.009634    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:29.009637    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:29.011661    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:29.509044    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:29.509059    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:29.509065    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:29.509068    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:29.511159    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:30.008831    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:30.008905    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:30.008916    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:30.008922    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:30.011246    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:30.011529    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:30.509817    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:30.509832    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:30.509838    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:30.509846    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:30.511920    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:31.008461    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:31.008483    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:31.008493    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:31.008499    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:31.011053    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:31.509184    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:31.509236    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:31.509247    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:31.509252    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:31.511776    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:32.008486    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:32.008510    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:32.008522    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:32.008531    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:32.011649    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:32.012066    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:32.510023    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:32.510037    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:32.510044    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:32.510048    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:32.512097    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:33.010283    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:33.010301    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:33.010310    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:33.010314    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:33.012927    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:33.509693    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:33.509712    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:33.509722    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:33.509726    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:33.512086    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.008568    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:34.008586    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:34.008594    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:34.008599    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:34.010823    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.509266    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:34.509365    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:34.509380    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:34.509386    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:34.512417    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.512850    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:35.009777    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:35.009792    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:35.009799    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:35.009802    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:35.011859    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:35.508525    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:35.508582    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:35.508590    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:35.508596    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:35.510810    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:36.009838    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:36.009864    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:36.009876    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:36.009881    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:36.012816    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:36.509201    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:36.509215    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:36.509265    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:36.509269    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:36.511244    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:37.010038    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:37.010064    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:37.010077    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:37.010083    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:37.013339    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:37.013728    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:37.509315    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:37.509330    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:37.509336    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:37.509339    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:37.511753    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:38.009336    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:38.009405    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:38.009415    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:38.009428    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:38.011725    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:38.508458    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:38.508483    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:38.508493    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:38.508500    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:38.511720    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:39.008429    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:39.008452    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:39.008459    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:39.008463    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:39.010408    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:39.508530    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:39.508555    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:39.508569    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:39.508577    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:39.511916    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:39.512435    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:40.009629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.009648    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.009663    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.009668    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.011742    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.509939    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.509963    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.509976    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.509982    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.512891    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.513173    2954 node_ready.go:49] node "ha-393000-m03" has status "Ready":"True"
	I0731 09:56:40.513182    2954 node_ready.go:38] duration metric: took 19.004877925s for node "ha-393000-m03" to be "Ready" ...
	I0731 09:56:40.513193    2954 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:56:40.513230    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:40.513235    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.513241    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.513244    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.517063    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:40.521698    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.521758    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 09:56:40.521763    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.521769    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.521773    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.524012    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.524507    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.524515    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.524521    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.524525    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.526095    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.526522    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.526532    2954 pod_ready.go:81] duration metric: took 4.820449ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.526539    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.526579    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 09:56:40.526584    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.526589    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.526597    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.528189    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.528737    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.528744    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.528750    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.528754    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.530442    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.530775    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.530784    2954 pod_ready.go:81] duration metric: took 4.239462ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.530790    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.530822    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 09:56:40.530827    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.530833    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.530840    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.532590    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.533050    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.533057    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.533062    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.533066    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.534760    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.535110    2954 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.535119    2954 pod_ready.go:81] duration metric: took 4.323936ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.535125    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.535164    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 09:56:40.535170    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.535175    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.535178    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.536947    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.537444    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:40.537451    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.537456    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.537460    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.539136    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.539571    2954 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.539580    2954 pod_ready.go:81] duration metric: took 4.45006ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.539587    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.710116    2954 request.go:629] Waited for 170.494917ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 09:56:40.710174    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 09:56:40.710180    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.710187    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.710190    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.712323    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.910582    2954 request.go:629] Waited for 197.870555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.910719    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.910732    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.910743    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.910750    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.913867    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:40.914265    2954 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.914278    2954 pod_ready.go:81] duration metric: took 374.68494ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.914293    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.110758    2954 request.go:629] Waited for 196.414025ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:56:41.110829    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:56:41.110835    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.110841    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.110844    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.112890    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:41.311962    2954 request.go:629] Waited for 198.609388ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:41.311995    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:41.312000    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.312006    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.312010    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.314041    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:41.314399    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:41.314410    2954 pod_ready.go:81] duration metric: took 400.109149ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.314418    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.511371    2954 request.go:629] Waited for 196.905615ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:56:41.511497    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:56:41.511508    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.511519    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.511526    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.514702    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:41.710099    2954 request.go:629] Waited for 194.801702ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:41.710131    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:41.710137    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.710143    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.710148    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.711902    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:41.712201    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:41.712211    2954 pod_ready.go:81] duration metric: took 397.788368ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.712225    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.910472    2954 request.go:629] Waited for 198.191914ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 09:56:41.910629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 09:56:41.910640    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.910651    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.910657    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.913895    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:42.111114    2954 request.go:629] Waited for 196.678487ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:42.111206    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:42.111214    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.111222    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.111228    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.113500    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.113867    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.113876    2954 pod_ready.go:81] duration metric: took 401.646528ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.113883    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.310054    2954 request.go:629] Waited for 196.129077ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:56:42.310144    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:56:42.310151    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.310157    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.310161    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.312081    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:42.510104    2954 request.go:629] Waited for 197.491787ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:42.510220    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:42.510230    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.510241    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.510249    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.512958    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.513508    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.513521    2954 pod_ready.go:81] duration metric: took 399.632057ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.513531    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.710421    2954 request.go:629] Waited for 196.851281ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:56:42.710510    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:56:42.710517    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.710523    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.710527    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.713018    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.910158    2954 request.go:629] Waited for 196.774024ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:42.910295    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:42.910307    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.910319    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.910327    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.913021    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.913406    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.913416    2954 pod_ready.go:81] duration metric: took 399.880068ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.913423    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.110445    2954 request.go:629] Waited for 196.965043ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 09:56:43.110548    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 09:56:43.110603    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.110615    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.110630    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.113588    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.311083    2954 request.go:629] Waited for 196.925492ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:43.311134    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:43.311139    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.311146    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.311149    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.313184    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.313462    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:43.313472    2954 pod_ready.go:81] duration metric: took 400.04465ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.313479    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.510584    2954 request.go:629] Waited for 197.060501ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:56:43.510710    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:56:43.510722    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.510731    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.510737    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.513575    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.710025    2954 request.go:629] Waited for 195.991998ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:43.710104    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:43.710111    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.710117    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.710121    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.712314    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.712653    2954 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:43.712663    2954 pod_ready.go:81] duration metric: took 399.178979ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.712670    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.910041    2954 request.go:629] Waited for 197.319656ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 09:56:43.910085    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 09:56:43.910092    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.910100    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.910108    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.913033    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.110409    2954 request.go:629] Waited for 196.775647ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:44.110512    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:44.110520    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.110526    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.110530    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.112726    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.113050    2954 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.113060    2954 pod_ready.go:81] duration metric: took 400.385455ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.113067    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.310143    2954 request.go:629] Waited for 197.043092ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:56:44.310236    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:56:44.310243    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.310253    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.310258    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.312471    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.510561    2954 request.go:629] Waited for 197.642859ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.510715    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.510728    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.510742    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.510750    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.513815    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:44.514349    2954 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.514363    2954 pod_ready.go:81] duration metric: took 401.290361ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.514372    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.711407    2954 request.go:629] Waited for 196.995177ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:56:44.711475    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:56:44.711482    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.711488    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.711491    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.713573    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.910056    2954 request.go:629] Waited for 196.042855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.910095    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.910103    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.910112    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.910117    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.912608    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.912924    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.912934    2954 pod_ready.go:81] duration metric: took 398.555138ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.912941    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.112001    2954 request.go:629] Waited for 199.012783ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:56:45.112114    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:56:45.112125    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.112136    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.112142    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.115328    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:45.310138    2954 request.go:629] Waited for 194.249421ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:45.310197    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:45.310207    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.310217    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.310226    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.315131    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:45.315432    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:45.315442    2954 pod_ready.go:81] duration metric: took 402.495485ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.315449    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.510510    2954 request.go:629] Waited for 195.017136ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 09:56:45.510595    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 09:56:45.510601    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.510607    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.510614    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.512663    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:45.709970    2954 request.go:629] Waited for 196.900157ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:45.710056    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:45.710063    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.710069    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.710073    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.712279    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:45.712540    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:45.712550    2954 pod_ready.go:81] duration metric: took 397.095893ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.712557    2954 pod_ready.go:38] duration metric: took 5.199358243s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:56:45.712568    2954 api_server.go:52] waiting for apiserver process to appear ...
	I0731 09:56:45.712620    2954 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:56:45.724210    2954 api_server.go:72] duration metric: took 24.575970869s to wait for apiserver process to appear ...
	I0731 09:56:45.724224    2954 api_server.go:88] waiting for apiserver healthz status ...
	I0731 09:56:45.724236    2954 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 09:56:45.729801    2954 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 09:56:45.729848    2954 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 09:56:45.729855    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.729862    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.729867    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.731097    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:45.731132    2954 api_server.go:141] control plane version: v1.30.3
	I0731 09:56:45.731141    2954 api_server.go:131] duration metric: took 6.912618ms to wait for apiserver health ...
	I0731 09:56:45.731147    2954 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 09:56:45.910423    2954 request.go:629] Waited for 179.236536ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:45.910520    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:45.910529    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.910537    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.910541    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.914926    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:45.919715    2954 system_pods.go:59] 24 kube-system pods found
	I0731 09:56:45.919728    2954 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:56:45.919732    2954 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:56:45.919735    2954 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:56:45.919738    2954 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:56:45.919742    2954 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 09:56:45.919745    2954 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:56:45.919748    2954 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:56:45.919750    2954 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 09:56:45.919753    2954 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:56:45.919756    2954 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:56:45.919759    2954 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 09:56:45.919761    2954 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:56:45.919764    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:56:45.919767    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 09:56:45.919770    2954 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:56:45.919773    2954 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 09:56:45.919776    2954 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:56:45.919778    2954 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:56:45.919780    2954 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:56:45.919783    2954 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 09:56:45.919785    2954 system_pods.go:61] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:56:45.919789    2954 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:56:45.919792    2954 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 09:56:45.919795    2954 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:56:45.919799    2954 system_pods.go:74] duration metric: took 188.647794ms to wait for pod list to return data ...
	I0731 09:56:45.919808    2954 default_sa.go:34] waiting for default service account to be created ...
	I0731 09:56:46.110503    2954 request.go:629] Waited for 190.648848ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:56:46.110629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:56:46.110641    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.110653    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.110659    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.113864    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:46.113948    2954 default_sa.go:45] found service account: "default"
	I0731 09:56:46.113959    2954 default_sa.go:55] duration metric: took 194.145984ms for default service account to be created ...
	I0731 09:56:46.113966    2954 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 09:56:46.310339    2954 request.go:629] Waited for 196.331355ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:46.310381    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:46.310387    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.310420    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.310424    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.314581    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:46.318894    2954 system_pods.go:86] 24 kube-system pods found
	I0731 09:56:46.318910    2954 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:56:46.318914    2954 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:56:46.318918    2954 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:56:46.318921    2954 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:56:46.318926    2954 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 09:56:46.318931    2954 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:56:46.318934    2954 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:56:46.318939    2954 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 09:56:46.318942    2954 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:56:46.318946    2954 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:56:46.318950    2954 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 09:56:46.318955    2954 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:56:46.318958    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:56:46.318963    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 09:56:46.318966    2954 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:56:46.318970    2954 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 09:56:46.318973    2954 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:56:46.318976    2954 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:56:46.318980    2954 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:56:46.318983    2954 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 09:56:46.318987    2954 system_pods.go:89] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:56:46.318990    2954 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:56:46.318993    2954 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 09:56:46.318996    2954 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:56:46.319002    2954 system_pods.go:126] duration metric: took 205.029246ms to wait for k8s-apps to be running ...
	I0731 09:56:46.319007    2954 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 09:56:46.319063    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:56:46.330197    2954 system_svc.go:56] duration metric: took 11.183343ms WaitForService to wait for kubelet
	I0731 09:56:46.330213    2954 kubeadm.go:582] duration metric: took 25.181975511s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:56:46.330225    2954 node_conditions.go:102] verifying NodePressure condition ...
	I0731 09:56:46.509976    2954 request.go:629] Waited for 179.711714ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 09:56:46.510033    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 09:56:46.510039    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.510045    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.510049    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.512677    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:46.513343    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513352    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513358    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513361    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513364    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513367    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513371    2954 node_conditions.go:105] duration metric: took 183.142994ms to run NodePressure ...
	I0731 09:56:46.513378    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:56:46.513392    2954 start.go:255] writing updated cluster config ...
	I0731 09:56:46.513784    2954 ssh_runner.go:195] Run: rm -f paused
	I0731 09:56:46.555311    2954 start.go:600] kubectl: 1.29.2, cluster: 1.30.3 (minor skew: 1)
	I0731 09:56:46.577040    2954 out.go:177] * Done! kubectl is now configured to use "ha-393000" cluster and "default" namespace by default
	
	
	==> Docker <==
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/25b3d6db405f49d365d6f33539e94ee4547921a7d0c463b94585056341530cda/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/c2a288a20831d0407ed1a2c3eeeb19a9758ef98813b916541258c8c58bcce38c/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/480020f5f9c0ce2e553e007beff5dfbe53b17bd2beaa73039be50701f04b9e76/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428712215Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428950502Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428960130Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.429078581Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477484798Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477564679Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477577219Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477869035Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507078466Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507147792Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507166914Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507244276Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853207982Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853706000Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853772518Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.854059851Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:56:47Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/e9ce137a2245c1333d3f3961469d32237e88656784f689211ed86cae2fd5518f/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Jul 31 16:56:49 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:56:49Z" level=info msg="Stop pulling image gcr.io/k8s-minikube/busybox:1.28: Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:1.28"
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157487366Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157549945Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157563641Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.158058722Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   About a minute ago   Running             busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         3 minutes ago        Running             coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         3 minutes ago        Running             coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	6d966e37d3618       6e38f40d628db                                                                                         3 minutes ago        Running             storage-provisioner       0                   25b3d6db405f4       storage-provisioner
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              4 minutes ago        Running             kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         4 minutes ago        Running             kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	e68314e525ef8       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     4 minutes ago        Running             kube-vip                  0                   c9f21d49b1384       kube-vip-ha-393000
	ab4f453cbe097       1f6d574d502f3                                                                                         4 minutes ago        Running             kube-apiserver            0                   7dc7f319faa98       kube-apiserver-ha-393000
	63e56744c84ee       3861cfcd7c04c                                                                                         4 minutes ago        Running             etcd                      0                   f8f20b1290499       etcd-ha-393000
	e19f7878939c9       76932a3b37d7e                                                                                         4 minutes ago        Running             kube-controller-manager   0                   67c995d2d2a3b       kube-controller-manager-ha-393000
	65412448c586b       3edc18e7b7672                                                                                         4 minutes ago        Running             kube-scheduler            0                   7ab9affa89eca       kube-scheduler-ha-393000
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:34336 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000091143s
	[INFO] 10.244.2.2:60404 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000085158s
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	
	
	==> coredns [feda36fb8a03] <==
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:43418 - 53237 "HINFO IN 5926041632293031093.721085148118182160. udp 56 false 512" NXDOMAIN qr,rd,ra 131 0.013101738s
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	
	
	==> describe nodes <==
	Name:               ha-393000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:53:48 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 16:58:08 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:54:24 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-393000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 baf02d554c20474b9fadb280fa1b8544
	  System UUID:                2cfe48dd-0000-0000-9b98-537ad9823a95
	  Boot ID:                    d6aa7e74-2f58-4a9d-a5df-37153dda8239
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-b94zr              0 (0%)        0 (0%)      0 (0%)           0 (0%)         85s
	  kube-system                 coredns-7db6d8ff4d-5m8st             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     4m7s
	  kube-system                 coredns-7db6d8ff4d-wvqjl             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     4m7s
	  kube-system                 etcd-ha-393000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         4m21s
	  kube-system                 kindnet-hjm7c                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      4m7s
	  kube-system                 kube-apiserver-ha-393000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         4m21s
	  kube-system                 kube-controller-manager-ha-393000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         4m22s
	  kube-system                 kube-proxy-zc52f                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m7s
	  kube-system                 kube-scheduler-ha-393000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         4m21s
	  kube-system                 kube-vip-ha-393000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m24s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m6s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age    From             Message
	  ----    ------                   ----   ----             -------
	  Normal  Starting                 4m5s   kube-proxy       
	  Normal  Starting                 4m21s  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  4m21s  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  4m21s  kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    4m21s  kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     4m21s  kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           4m8s   node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  NodeReady                3m48s  kubelet          Node ha-393000 status is now: NodeReady
	  Normal  RegisteredNode           2m49s  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           97s    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	
	
	Name:               ha-393000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:55:06 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 16:58:10 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:27 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-393000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 ef1036a76f3140bd891095c317498193
	  System UUID:                7863443c-0000-0000-8e8d-bbd47bc06547
	  Boot ID:                    d1d2508d-2745-4c36-9513-9d28d75304e0
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-zln22                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         85s
	  kube-system                 etcd-ha-393000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         3m4s
	  kube-system                 kindnet-lcwbs                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      3m6s
	  kube-system                 kube-apiserver-ha-393000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         3m4s
	  kube-system                 kube-controller-manager-ha-393000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         3m4s
	  kube-system                 kube-proxy-cf577                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m6s
	  kube-system                 kube-scheduler-ha-393000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         3m4s
	  kube-system                 kube-vip-ha-393000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m2s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 3m1s                 kube-proxy       
	  Normal  NodeHasSufficientMemory  3m6s (x8 over 3m6s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    3m6s (x8 over 3m6s)  kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     3m6s (x7 over 3m6s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  3m6s                 kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           3m3s                 node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal  RegisteredNode           2m49s                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal  RegisteredNode           97s                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	
	
	Name:               ha-393000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:56:18 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 16:58:10 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:40 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-393000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 86f4bf9242d1461e9aec7b900dfd2277
	  System UUID:                451d42a6-0000-0000-8ccb-b8851dda0594
	  Boot ID:                    07f25a3c-b688-461e-9d49-0a60051d0c3c
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-n8d7h                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         85s
	  kube-system                 etcd-ha-393000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         112s
	  kube-system                 kindnet-s2pv6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      114s
	  kube-system                 kube-apiserver-ha-393000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         113s
	  kube-system                 kube-controller-manager-ha-393000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         112s
	  kube-system                 kube-proxy-cr9pg                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         114s
	  kube-system                 kube-scheduler-ha-393000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         113s
	  kube-system                 kube-vip-ha-393000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         110s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 110s                 kube-proxy       
	  Normal  NodeHasSufficientMemory  114s (x8 over 114s)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    114s (x8 over 114s)  kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     114s (x7 over 114s)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  114s                 kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           113s                 node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal  RegisteredNode           109s                 node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal  RegisteredNode           97s                  node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	
	
	==> dmesg <==
	[  +2.764750] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.236579] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000003] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.776173] systemd-fstab-generator[496]: Ignoring "noauto" option for root device
	[  +0.099418] systemd-fstab-generator[508]: Ignoring "noauto" option for root device
	[  +1.822617] systemd-fstab-generator[843]: Ignoring "noauto" option for root device
	[  +0.280031] systemd-fstab-generator[881]: Ignoring "noauto" option for root device
	[  +0.062769] kauditd_printk_skb: 95 callbacks suppressed
	[  +0.051458] systemd-fstab-generator[893]: Ignoring "noauto" option for root device
	[  +0.120058] systemd-fstab-generator[907]: Ignoring "noauto" option for root device
	[  +2.468123] systemd-fstab-generator[1123]: Ignoring "noauto" option for root device
	[  +0.099873] systemd-fstab-generator[1135]: Ignoring "noauto" option for root device
	[  +0.092257] systemd-fstab-generator[1147]: Ignoring "noauto" option for root device
	[  +0.106918] systemd-fstab-generator[1162]: Ignoring "noauto" option for root device
	[  +3.770701] systemd-fstab-generator[1268]: Ignoring "noauto" option for root device
	[  +0.056009] kauditd_printk_skb: 180 callbacks suppressed
	[  +2.552095] systemd-fstab-generator[1523]: Ignoring "noauto" option for root device
	[  +4.084188] systemd-fstab-generator[1702]: Ignoring "noauto" option for root device
	[  +0.054525] kauditd_printk_skb: 70 callbacks suppressed
	[  +7.033653] systemd-fstab-generator[2202]: Ignoring "noauto" option for root device
	[  +0.072815] kauditd_printk_skb: 72 callbacks suppressed
	[Jul31 16:54] kauditd_printk_skb: 12 callbacks suppressed
	[ +19.132251] kauditd_printk_skb: 38 callbacks suppressed
	[Jul31 16:55] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [63e56744c84e] <==
	{"level":"info","ts":"2024-07-31T16:55:08.065421Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-07-31T16:55:08.065437Z","caller":"etcdserver/server.go:1946","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T16:56:18.524077Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(2035864250365333051 13314548521573537860) learners=(14707668837576794450)"}
	{"level":"info","ts":"2024-07-31T16:56:18.525183Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844","added-peer-id":"cc1c22e219d8e152","added-peer-peer-urls":["https://192.169.0.7:2380"]}
	{"level":"info","ts":"2024-07-31T16:56:18.525227Z","caller":"rafthttp/peer.go:133","msg":"starting remote peer","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.525267Z","caller":"rafthttp/pipeline.go:72","msg":"started HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.525776Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.52608Z","caller":"rafthttp/peer.go:137","msg":"started remote peer","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.526181Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.526208Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.526232Z","caller":"rafthttp/transport.go:317","msg":"added remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-urls":["https://192.169.0.7:2380"]}
	{"level":"info","ts":"2024-07-31T16:56:18.526495Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"warn","ts":"2024-07-31T16:56:18.572765Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"cc1c22e219d8e152","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"warn","ts":"2024-07-31T16:56:19.066544Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"cc1c22e219d8e152","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-07-31T16:56:19.78495Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:19.785013Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:19.792429Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:19.81362Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"cc1c22e219d8e152","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-07-31T16:56:19.813712Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:19.850768Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"cc1c22e219d8e152","stream-type":"stream Message"}
	{"level":"info","ts":"2024-07-31T16:56:19.850881Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"warn","ts":"2024-07-31T16:56:20.066154Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"cc1c22e219d8e152","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-07-31T16:56:20.566995Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(2035864250365333051 13314548521573537860 14707668837576794450)"}
	{"level":"info","ts":"2024-07-31T16:56:20.567341Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-07-31T16:56:20.567501Z","caller":"etcdserver/server.go:1946","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"cc1c22e219d8e152"}
	
	
	==> kernel <==
	 16:58:12 up 4 min,  0 users,  load average: 0.76, 0.48, 0.21
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:57:30.116433       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:57:40.110429       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:57:40.110467       1 main.go:299] handling current node
	I0731 16:57:40.110480       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:57:40.110484       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:57:40.110718       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:57:40.110749       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:57:50.109884       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:57:50.109964       1 main.go:299] handling current node
	I0731 16:57:50.109980       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:57:50.110115       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:57:50.110404       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:57:50.110446       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:58:00.116121       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:58:00.116198       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:58:00.116281       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:58:00.116321       1 main.go:299] handling current node
	I0731 16:58:00.116341       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:58:00.116353       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:58:10.110132       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:58:10.110172       1 main.go:299] handling current node
	I0731 16:58:10.110185       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:58:10.110190       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:58:10.110340       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:58:10.110368       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [ab4f453cbe09] <==
	I0731 16:53:49.787246       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0731 16:53:49.838971       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0731 16:53:49.842649       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	I0731 16:53:49.843317       1 controller.go:615] quota admission added evaluator for: endpoints
	I0731 16:53:49.845885       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0731 16:53:50.451090       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0731 16:53:51.578858       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0731 16:53:51.587918       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0731 16:53:51.594571       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0731 16:54:05.505988       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	I0731 16:54:05.655031       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	E0731 16:56:52.014947       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51195: use of closed network connection
	E0731 16:56:52.206354       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51197: use of closed network connection
	E0731 16:56:52.403109       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51199: use of closed network connection
	E0731 16:56:52.600256       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51201: use of closed network connection
	E0731 16:56:52.785054       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51203: use of closed network connection
	E0731 16:56:53.004706       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51205: use of closed network connection
	E0731 16:56:53.208399       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51207: use of closed network connection
	E0731 16:56:53.392187       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51209: use of closed network connection
	E0731 16:56:53.714246       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51212: use of closed network connection
	E0731 16:56:53.895301       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51214: use of closed network connection
	E0731 16:56:54.078794       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51216: use of closed network connection
	E0731 16:56:54.262767       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51218: use of closed network connection
	E0731 16:56:54.448344       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51220: use of closed network connection
	E0731 16:56:54.629926       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51222: use of closed network connection
	
	
	==> kube-controller-manager [e19f7878939c] <==
	I0731 16:54:25.766270       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="26.902µs"
	I0731 16:54:29.808610       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0731 16:55:06.430472       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-393000-m02\" does not exist"
	I0731 16:55:06.448216       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-393000-m02" podCIDRs=["10.244.1.0/24"]
	I0731 16:55:09.814349       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-393000-m02"
	E0731 16:56:18.277948       1 certificate_controller.go:146] Sync csr-v42tm failed with : error updating signature for csr: Operation cannot be fulfilled on certificatesigningrequests.certificates.k8s.io "csr-v42tm": the object has been modified; please apply your changes to the latest version and try again
	I0731 16:56:18.384134       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-393000-m03\" does not exist"
	I0731 16:56:18.398095       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-393000-m03" podCIDRs=["10.244.2.0/24"]
	I0731 16:56:19.822872       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-393000-m03"
	I0731 16:56:47.522324       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="152.941157ms"
	I0731 16:56:47.574976       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="52.539469ms"
	I0731 16:56:47.678922       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="103.895055ms"
	I0731 16:56:47.701560       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="22.534098ms"
	I0731 16:56:47.701787       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="74.391µs"
	I0731 16:56:47.718186       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="8.079697ms"
	I0731 16:56:47.718269       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="39.867µs"
	I0731 16:56:47.744772       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="8.73015ms"
	I0731 16:56:47.745065       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="34.302µs"
	I0731 16:56:48.288860       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="38.605µs"
	I0731 16:56:49.532986       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.769402ms"
	I0731 16:56:49.533229       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="37.061µs"
	I0731 16:56:49.677499       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.411426ms"
	I0731 16:56:49.677560       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="21.894µs"
	I0731 16:56:51.343350       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="15.340858ms"
	I0731 16:56:51.343434       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="38.532µs"
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [65412448c586] <==
	W0731 16:53:48.491080       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0731 16:53:48.491132       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0731 16:53:48.491335       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:48.491387       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0731 16:53:48.491507       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0731 16:53:48.491594       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0731 16:53:48.491662       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:48.491738       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:48.491818       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:48.491860       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:48.491537       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:48.491873       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.319781       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0731 16:53:49.319838       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0731 16:53:49.326442       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.326478       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.392116       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:49.392172       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:49.496014       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.496036       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.541411       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:49.541927       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:49.588695       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:49.588735       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	I0731 16:53:49.982415       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 31 16:54:25 ha-393000 kubelet[2209]: I0731 16:54:25.725648    2209 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-5m8st" podStartSLOduration=20.725636637 podStartE2EDuration="20.725636637s" podCreationTimestamp="2024-07-31 16:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-31 16:54:25.724822579 +0000 UTC m=+33.396938994" watchObservedRunningTime="2024-07-31 16:54:25.725636637 +0000 UTC m=+33.397753046"
	Jul 31 16:54:25 ha-393000 kubelet[2209]: I0731 16:54:25.753514    2209 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/storage-provisioner" podStartSLOduration=19.753503033 podStartE2EDuration="19.753503033s" podCreationTimestamp="2024-07-31 16:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-31 16:54:25.752974741 +0000 UTC m=+33.425091155" watchObservedRunningTime="2024-07-31 16:54:25.753503033 +0000 UTC m=+33.425619443"
	Jul 31 16:54:52 ha-393000 kubelet[2209]: E0731 16:54:52.468990    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:54:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:54:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:54:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:54:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:55:52 ha-393000 kubelet[2209]: E0731 16:55:52.468170    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:55:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:55:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:55:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:55:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.510532    2209 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-wvqjl" podStartSLOduration=162.510247367 podStartE2EDuration="2m42.510247367s" podCreationTimestamp="2024-07-31 16:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-31 16:54:25.761498183 +0000 UTC m=+33.433614594" watchObservedRunningTime="2024-07-31 16:56:47.510247367 +0000 UTC m=+175.182363776"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.510944    2209 topology_manager.go:215] "Topology Admit Handler" podUID="dd382c29-63af-44cb-bf5b-b7db27f11017" podNamespace="default" podName="busybox-fc5497c4f-b94zr"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.640155    2209 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp8k4\" (UniqueName: \"kubernetes.io/projected/dd382c29-63af-44cb-bf5b-b7db27f11017-kube-api-access-cp8k4\") pod \"busybox-fc5497c4f-b94zr\" (UID: \"dd382c29-63af-44cb-bf5b-b7db27f11017\") " pod="default/busybox-fc5497c4f-b94zr"
	Jul 31 16:56:52 ha-393000 kubelet[2209]: E0731 16:56:52.472632    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:56:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:57:52 ha-393000 kubelet[2209]: E0731 16:57:52.468077    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:57:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:261: (dbg) Run:  kubectl --context ha-393000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:285: <<< TestMultiControlPlane/serial/AddWorkerNode FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/AddWorkerNode (79.37s)

                                                
                                    
TestMultiControlPlane/serial/CopyFile (3.28s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/CopyFile
ha_test.go:326: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status --output json -v=7 --alsologtostderr
ha_test.go:326: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status --output json -v=7 --alsologtostderr: exit status 2 (454.538978ms)

                                                
                                                
-- stdout --
	[{"Name":"ha-393000","Host":"Running","Kubelet":"Running","APIServer":"Running","Kubeconfig":"Configured","Worker":false},{"Name":"ha-393000-m02","Host":"Running","Kubelet":"Running","APIServer":"Running","Kubeconfig":"Configured","Worker":false},{"Name":"ha-393000-m03","Host":"Running","Kubelet":"Running","APIServer":"Running","Kubeconfig":"Configured","Worker":false},{"Name":"ha-393000-m04","Host":"Running","Kubelet":"Stopped","APIServer":"Irrelevant","Kubeconfig":"Irrelevant","Worker":true}]

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 09:58:14.320792    3131 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:58:14.320979    3131 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:58:14.320985    3131 out.go:304] Setting ErrFile to fd 2...
	I0731 09:58:14.320988    3131 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:58:14.321180    3131 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:58:14.321356    3131 out.go:298] Setting JSON to true
	I0731 09:58:14.321387    3131 mustload.go:65] Loading cluster: ha-393000
	I0731 09:58:14.321455    3131 notify.go:220] Checking for updates...
	I0731 09:58:14.321727    3131 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:58:14.321743    3131 status.go:255] checking status of ha-393000 ...
	I0731 09:58:14.322083    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.322123    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.330993    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51305
	I0731 09:58:14.331311    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.331721    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.331730    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.331962    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.332075    3131 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:58:14.332169    3131 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:58:14.332236    3131 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:58:14.333238    3131 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 09:58:14.333258    3131 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:58:14.333500    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.333536    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.341896    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51307
	I0731 09:58:14.342237    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.342579    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.342597    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.342835    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.342937    3131 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:58:14.343035    3131 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:58:14.343310    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.343335    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.352067    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51309
	I0731 09:58:14.352392    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.352737    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.352753    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.352950    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.353060    3131 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:58:14.353205    3131 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:58:14.353223    3131 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:58:14.353291    3131 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:58:14.353372    3131 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:58:14.353449    3131 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:58:14.353530    3131 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:58:14.390541    3131 ssh_runner.go:195] Run: systemctl --version
	I0731 09:58:14.394709    3131 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:58:14.406210    3131 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:58:14.406235    3131 api_server.go:166] Checking apiserver status ...
	I0731 09:58:14.406272    3131 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:58:14.418779    3131 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:58:14.426816    3131 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:58:14.426866    3131 ssh_runner.go:195] Run: ls
	I0731 09:58:14.430004    3131 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:58:14.433085    3131 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:58:14.433095    3131 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 09:58:14.433104    3131 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:58:14.433114    3131 status.go:255] checking status of ha-393000-m02 ...
	I0731 09:58:14.433358    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.433378    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.441910    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51313
	I0731 09:58:14.442243    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.442581    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.442595    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.442805    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.442929    3131 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:58:14.443016    3131 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:58:14.443089    3131 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:58:14.444103    3131 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 09:58:14.444113    3131 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:58:14.444370    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.444396    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.453250    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51315
	I0731 09:58:14.453600    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.453933    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.453953    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.454172    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.454281    3131 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:58:14.454362    3131 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:58:14.454633    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.454663    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.463448    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51317
	I0731 09:58:14.463800    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.464133    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.464149    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.464358    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.464467    3131 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:58:14.464586    3131 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:58:14.464597    3131 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:58:14.464678    3131 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:58:14.464759    3131 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:58:14.464840    3131 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:58:14.464924    3131 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:58:14.499736    3131 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:58:14.511777    3131 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:58:14.511793    3131 api_server.go:166] Checking apiserver status ...
	I0731 09:58:14.511830    3131 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:58:14.524499    3131 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1991/cgroup
	W0731 09:58:14.532105    3131 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1991/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:58:14.532154    3131 ssh_runner.go:195] Run: ls
	I0731 09:58:14.535478    3131 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:58:14.538589    3131 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:58:14.538599    3131 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 09:58:14.538607    3131 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:58:14.538617    3131 status.go:255] checking status of ha-393000-m03 ...
	I0731 09:58:14.538878    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.538900    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.547521    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51321
	I0731 09:58:14.547865    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.548199    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.548213    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.548414    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.548524    3131 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:58:14.548600    3131 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:58:14.548677    3131 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:58:14.549693    3131 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 09:58:14.549703    3131 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:58:14.549944    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.549972    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.558609    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51323
	I0731 09:58:14.558936    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.559278    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.559295    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.559518    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.559629    3131 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:58:14.559715    3131 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:58:14.559976    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.560006    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.568536    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51325
	I0731 09:58:14.568863    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.569182    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.569199    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.569413    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.569539    3131 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:58:14.569670    3131 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:58:14.569689    3131 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:58:14.569766    3131 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:58:14.569845    3131 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:58:14.569914    3131 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:58:14.569985    3131 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:58:14.605794    3131 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:58:14.618307    3131 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:58:14.618322    3131 api_server.go:166] Checking apiserver status ...
	I0731 09:58:14.618367    3131 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:58:14.629776    3131 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 09:58:14.637455    3131 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:58:14.637499    3131 ssh_runner.go:195] Run: ls
	I0731 09:58:14.640648    3131 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:58:14.643710    3131 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:58:14.643721    3131 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 09:58:14.643729    3131 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:58:14.643739    3131 status.go:255] checking status of ha-393000-m04 ...
	I0731 09:58:14.644022    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.644045    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.652884    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51329
	I0731 09:58:14.653225    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.653570    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.653588    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.653827    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.653948    3131 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:58:14.654047    3131 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:58:14.654127    3131 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:58:14.655130    3131 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 09:58:14.655140    3131 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:58:14.655400    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.655423    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.664038    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51331
	I0731 09:58:14.664352    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.664721    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.664740    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.664939    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.665040    3131 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:58:14.665133    3131 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:58:14.665404    3131 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:14.665432    3131 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:14.673899    3131 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51333
	I0731 09:58:14.674248    3131 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:14.674587    3131 main.go:141] libmachine: Using API Version  1
	I0731 09:58:14.674602    3131 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:14.674820    3131 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:14.674920    3131 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:58:14.675039    3131 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:58:14.675051    3131 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:58:14.675127    3131 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:58:14.675200    3131 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:58:14.675269    3131 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:58:14.675347    3131 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:58:14.708130    3131 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:58:14.719406    3131 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:328: failed to run minikube status. args "out/minikube-darwin-amd64 -p ha-393000 status --output json -v=7 --alsologtostderr" : exit status 2
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:244: <<< TestMultiControlPlane/serial/CopyFile FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/CopyFile]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
E0731 09:58:16.431212    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (2.14430618s)
helpers_test.go:252: TestMultiControlPlane/serial/CopyFile logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |      Profile      |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| image   | functional-680000 image build -t     | functional-680000 | jenkins | v1.33.1 | 31 Jul 24 09:52 PDT | 31 Jul 24 09:52 PDT |
	|         | localhost/my-image:functional-680000 |                   |         |         |                     |                     |
	|         | testdata/build --alsologtostderr     |                   |         |         |                     |                     |
	| image   | functional-680000 image ls           | functional-680000 | jenkins | v1.33.1 | 31 Jul 24 09:52 PDT | 31 Jul 24 09:52 PDT |
	| delete  | -p functional-680000                 | functional-680000 | jenkins | v1.33.1 | 31 Jul 24 09:53 PDT | 31 Jul 24 09:53 PDT |
	| start   | -p ha-393000 --wait=true             | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:53 PDT | 31 Jul 24 09:56 PDT |
	|         | --memory=2200 --ha                   |                   |         |         |                     |                     |
	|         | -v=7 --alsologtostderr               |                   |         |         |                     |                     |
	|         | --driver=hyperkit                    |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- apply -f             | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | ./testdata/ha/ha-pod-dns-test.yaml   |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- rollout status       | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | deployment/busybox                   |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |                   |         |         |                     |                     |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 09:53:16
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 09:53:16.140722    2954 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:53:16.140891    2954 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:53:16.140897    2954 out.go:304] Setting ErrFile to fd 2...
	I0731 09:53:16.140901    2954 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:53:16.141085    2954 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:53:16.142669    2954 out.go:298] Setting JSON to false
	I0731 09:53:16.166361    2954 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1366,"bootTime":1722443430,"procs":467,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 09:53:16.166460    2954 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 09:53:16.192371    2954 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 09:53:16.233499    2954 notify.go:220] Checking for updates...
	I0731 09:53:16.263444    2954 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 09:53:16.328756    2954 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:53:16.398694    2954 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 09:53:16.420465    2954 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 09:53:16.443406    2954 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:53:16.464565    2954 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 09:53:16.486871    2954 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 09:53:16.517461    2954 out.go:177] * Using the hyperkit driver based on user configuration
	I0731 09:53:16.559490    2954 start.go:297] selected driver: hyperkit
	I0731 09:53:16.559519    2954 start.go:901] validating driver "hyperkit" against <nil>
	I0731 09:53:16.559538    2954 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 09:53:16.563960    2954 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:53:16.564071    2954 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 09:53:16.572413    2954 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 09:53:16.576399    2954 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:53:16.576420    2954 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 09:53:16.576454    2954 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 09:53:16.576646    2954 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:53:16.576708    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:16.576719    2954 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0731 09:53:16.576725    2954 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0731 09:53:16.576791    2954 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:53:16.576877    2954 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:53:16.619419    2954 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 09:53:16.640390    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:53:16.640480    2954 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 09:53:16.640509    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:53:16.640712    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:53:16.640731    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:53:16.641227    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:53:16.641275    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json: {Name:mka52f595799559e261228b691f11b60413ee780 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:16.641876    2954 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:53:16.641986    2954 start.go:364] duration metric: took 90.888µs to acquireMachinesLock for "ha-393000"
	I0731 09:53:16.642025    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:53:16.642108    2954 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 09:53:16.663233    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:53:16.663389    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:53:16.663426    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:53:16.672199    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51037
	I0731 09:53:16.672559    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:53:16.672976    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:53:16.672987    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:53:16.673241    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:53:16.673369    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:16.673473    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:16.673584    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:53:16.673605    2954 client.go:168] LocalClient.Create starting
	I0731 09:53:16.673642    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:53:16.673693    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:53:16.673710    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:53:16.673763    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:53:16.673801    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:53:16.673815    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:53:16.673840    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:53:16.673850    2954 main.go:141] libmachine: (ha-393000) Calling .PreCreateCheck
	I0731 09:53:16.673929    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:16.674073    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:16.684622    2954 main.go:141] libmachine: Creating machine...
	I0731 09:53:16.684647    2954 main.go:141] libmachine: (ha-393000) Calling .Create
	I0731 09:53:16.684806    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:16.685170    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.684943    2962 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:53:16.685305    2954 main.go:141] libmachine: (ha-393000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:53:16.866642    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.866533    2962 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa...
	I0731 09:53:16.907777    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.907707    2962 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk...
	I0731 09:53:16.907795    2954 main.go:141] libmachine: (ha-393000) DBG | Writing magic tar header
	I0731 09:53:16.907815    2954 main.go:141] libmachine: (ha-393000) DBG | Writing SSH key tar header
	I0731 09:53:16.908296    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.908249    2962 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000 ...
	I0731 09:53:17.278530    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:17.278549    2954 main.go:141] libmachine: (ha-393000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 09:53:17.278657    2954 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 09:53:17.388690    2954 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 09:53:17.388709    2954 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:53:17.388758    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:53:17.388793    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:53:17.388830    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:53:17.388871    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:53:17.388884    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:53:17.391787    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Pid is 2965
	I0731 09:53:17.392177    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 09:53:17.392188    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:17.392264    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:17.393257    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:17.393317    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:17.393342    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:17.393359    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:17.393369    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:17.399449    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:53:17.451566    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:53:17.452146    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:53:17.452168    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:53:17.452176    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:53:17.452184    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:53:17.832667    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:53:17.832680    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:53:17.947165    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:53:17.947181    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:53:17.947203    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:53:17.947214    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:53:17.948083    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:53:17.948094    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:53:19.393474    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 1
	I0731 09:53:19.393491    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:19.393544    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:19.394408    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:19.394431    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:19.394439    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:19.394449    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:19.394461    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:21.396273    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 2
	I0731 09:53:21.396290    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:21.396404    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:21.397210    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:21.397262    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:21.397275    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:21.397283    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:21.397292    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:23.397619    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 3
	I0731 09:53:23.397635    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:23.397733    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:23.398576    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:23.398585    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:23.398595    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:23.398604    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:23.398623    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:23.511265    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 09:53:23.511317    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 09:53:23.511327    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 09:53:23.534471    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 09:53:25.399722    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 4
	I0731 09:53:25.399735    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:25.399799    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:25.400596    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:25.400655    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:25.400665    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:25.400672    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:25.400681    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:27.400848    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 5
	I0731 09:53:27.400872    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:27.400976    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:27.401778    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:27.401824    2954 main.go:141] libmachine: (ha-393000) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:53:27.401836    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:53:27.401845    2954 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 09:53:27.401856    2954 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 09:53:27.401921    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:27.402530    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:27.402623    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:27.402706    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:53:27.402714    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:53:27.402795    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:27.402846    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:27.403621    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:53:27.403635    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:53:27.403641    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:53:27.403647    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:27.403727    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:27.403804    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:27.403889    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:27.403968    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:27.404083    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:27.404258    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:27.404265    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:53:28.471124    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:53:28.471139    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:53:28.471151    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.471303    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.471413    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.471516    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.471604    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.471751    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.471894    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.471902    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:53:28.534700    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:53:28.534755    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:53:28.534761    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:53:28.534766    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.534914    2954 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 09:53:28.534924    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.535023    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.535122    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.535205    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.535305    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.535404    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.535525    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.535678    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.535686    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 09:53:28.612223    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 09:53:28.612243    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.612383    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.612495    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.612585    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.612692    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.612835    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.612989    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.613000    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:53:28.684692    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:53:28.684711    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:53:28.684731    2954 buildroot.go:174] setting up certificates
	I0731 09:53:28.684742    2954 provision.go:84] configureAuth start
	I0731 09:53:28.684753    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.684892    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:28.684986    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.685097    2954 provision.go:143] copyHostCerts
	I0731 09:53:28.685132    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:53:28.685202    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:53:28.685210    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:53:28.685348    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:53:28.685544    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:53:28.685575    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:53:28.685580    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:53:28.685671    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:53:28.685817    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:53:28.685858    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:53:28.685863    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:53:28.685947    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:53:28.686099    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 09:53:28.975770    2954 provision.go:177] copyRemoteCerts
	I0731 09:53:28.975860    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:53:28.975879    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.976044    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.976151    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.976253    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.976368    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:29.014295    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:53:29.014364    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0731 09:53:29.033836    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:53:29.033901    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 09:53:29.053674    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:53:29.053744    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:53:29.073245    2954 provision.go:87] duration metric: took 388.494938ms to configureAuth
	I0731 09:53:29.073258    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:53:29.073388    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:53:29.073402    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:29.073538    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.073618    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.073712    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.073794    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.073871    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.073977    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.074114    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.074121    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:53:29.138646    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:53:29.138660    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:53:29.138727    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:53:29.138739    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.138887    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.138979    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.139070    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.139173    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.139333    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.139499    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.139544    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:53:29.214149    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:53:29.214180    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.214320    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.214403    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.214495    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.214599    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.214718    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.214856    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.214868    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:53:30.823417    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
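The `diff -u old new || { mv …; systemctl … }` command above (and its output here: `diff` can't stat the target, so the staged unit is moved into place and Docker is restarted) is an idempotent-update pattern: the unit file is replaced only when the staged copy differs from, or the target is missing relative to, the active one. A minimal local sketch of the same shape, using temp files instead of `/lib/systemd/system` and without the `systemctl` side effects:

```shell
set -eu
dir=$(mktemp -d)
printf 'ExecStart=old\n' > "$dir/docker.service"
printf 'ExecStart=new\n' > "$dir/docker.service.new"
# Replace the active file only if the staged copy differs (or the active
# file is missing, in which case diff itself fails and the || branch runs).
diff -u "$dir/docker.service" "$dir/docker.service.new" >/dev/null 2>&1 || \
  mv "$dir/docker.service.new" "$dir/docker.service"
cat "$dir/docker.service"
```

Run a second time with identical contents, `diff` exits 0 and the `mv` (and, in the real command, the daemon-reload/restart) is skipped, which is why minikube can re-run provisioning safely.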
	I0731 09:53:30.823433    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:53:30.823439    2954 main.go:141] libmachine: (ha-393000) Calling .GetURL
	I0731 09:53:30.823574    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:53:30.823582    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:53:30.823587    2954 client.go:171] duration metric: took 14.150104113s to LocalClient.Create
	I0731 09:53:30.823598    2954 start.go:167] duration metric: took 14.150148374s to libmachine.API.Create "ha-393000"
	I0731 09:53:30.823607    2954 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 09:53:30.823621    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:53:30.823633    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.823781    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:53:30.823793    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.823880    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.823974    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.824065    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.824160    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:30.868545    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:53:30.872572    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:53:30.872587    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:53:30.872696    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:53:30.872889    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:53:30.872896    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:53:30.873123    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:53:30.890087    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:53:30.911977    2954 start.go:296] duration metric: took 88.361428ms for postStartSetup
	I0731 09:53:30.912003    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:30.912600    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:30.912759    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:53:30.913103    2954 start.go:128] duration metric: took 14.271109881s to createHost
	I0731 09:53:30.913117    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.913201    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.913305    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.913399    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.913473    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.913588    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:30.913703    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:30.913711    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:53:30.978737    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444810.120322538
	
	I0731 09:53:30.978750    2954 fix.go:216] guest clock: 1722444810.120322538
	I0731 09:53:30.978755    2954 fix.go:229] Guest: 2024-07-31 09:53:30.120322538 -0700 PDT Remote: 2024-07-31 09:53:30.913111 -0700 PDT m=+14.813015151 (delta=-792.788462ms)
	I0731 09:53:30.978778    2954 fix.go:200] guest clock delta is within tolerance: -792.788462ms
	I0731 09:53:30.978783    2954 start.go:83] releasing machines lock for "ha-393000", held for 14.336915594s
	I0731 09:53:30.978805    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.978937    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:30.979046    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979390    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979496    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979591    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:53:30.979625    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.979645    2954 ssh_runner.go:195] Run: cat /version.json
	I0731 09:53:30.979655    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.979750    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.979786    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.979846    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.979902    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.979927    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.979985    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.980003    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:30.980063    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:31.061693    2954 ssh_runner.go:195] Run: systemctl --version
	I0731 09:53:31.066472    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 09:53:31.070647    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:53:31.070687    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:53:31.084420    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:53:31.084432    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:53:31.084539    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:53:31.099368    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:53:31.108753    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:53:31.117896    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:53:31.117944    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:53:31.126974    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:53:31.135823    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:53:31.144673    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:53:31.153676    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:53:31.162890    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:53:31.171995    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:53:31.181357    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:53:31.190300    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:53:31.198317    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:53:31.206286    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:31.306658    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:53:31.325552    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:53:31.325643    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:53:31.346571    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:53:31.359753    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:53:31.393299    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:53:31.404448    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:53:31.414860    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:53:31.437636    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:53:31.448198    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:53:31.464071    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:53:31.467113    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:53:31.474646    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:53:31.488912    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:53:31.589512    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:53:31.693775    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:53:31.693845    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:53:31.709549    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:31.811094    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:53:34.149023    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.337932224s)
	I0731 09:53:34.149088    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:53:34.161198    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:53:34.175766    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:53:34.187797    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:53:34.283151    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:53:34.377189    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:34.469067    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:53:34.482248    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:53:34.492385    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:34.587912    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:53:34.647834    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:53:34.647904    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:53:34.652204    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:53:34.652250    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:53:34.655108    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:53:34.680326    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:53:34.680403    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:53:34.699387    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:53:34.764313    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:53:34.764369    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:34.764763    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:53:34.769523    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
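The `/etc/hosts` command above follows a filter-append-copy pattern: strip any stale `host.minikube.internal` entry, append the fresh mapping, write to a temp file, then `sudo cp` the result back (copying rather than `mv`-ing keeps the original inode, which matters when `/etc/hosts` is a bind mount). A simplified sketch against a temp file, with the tab anchor of the real command replaced by a space for readability:

```shell
set -eu
hosts=$(mktemp)
printf '127.0.0.1 localhost\n192.169.0.1 host.minikube.internal\n' > "$hosts"
tmp=$(mktemp)
# Drop any existing entry for the name, then append the fresh mapping.
{ grep -v 'host\.minikube\.internal$' "$hosts"; \
  printf '192.169.0.99 host.minikube.internal\n'; } > "$tmp"
# Copy back over the original instead of renaming, preserving the inode.
cp "$tmp" "$hosts"
cat "$hosts"
```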
	I0731 09:53:34.780319    2954 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 Moun
tType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 09:53:34.780379    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:53:34.780438    2954 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 09:53:34.792271    2954 docker.go:685] Got preloaded images: 
	I0731 09:53:34.792283    2954 docker.go:691] registry.k8s.io/kube-apiserver:v1.30.3 wasn't preloaded
	I0731 09:53:34.792332    2954 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0731 09:53:34.800298    2954 ssh_runner.go:195] Run: which lz4
	I0731 09:53:34.803039    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0731 09:53:34.803157    2954 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4
	I0731 09:53:34.806121    2954 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0731 09:53:34.806135    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (359612007 bytes)
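The `stat -c "%s %y"` call above is an existence probe: exit 0 means the preload tarball is already on the guest and the copy can be skipped; a non-zero exit (as in this run, `Process exited with status 1`) means it is absent and the ~360 MB `scp` goes ahead. A local sketch of both branches, assuming GNU coreutils `stat` (the `-c` flag is not available in BSD/macOS `stat`):

```shell
set -eu
f=$(mktemp)
# File present: stat exits 0 and reports "size mtime".
stat -c '%s %y' "$f" >/dev/null && state_present=yes
rm "$f"
# File absent: stat exits 1, which the caller treats as "needs copying".
stat -c '%s %y' "$f" >/dev/null 2>&1 || state_absent=yes
echo "present=$state_present absent=$state_absent"
```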
	I0731 09:53:35.858525    2954 docker.go:649] duration metric: took 1.055419334s to copy over tarball
	I0731 09:53:35.858591    2954 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0731 09:53:38.196952    2954 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.338365795s)
	I0731 09:53:38.196967    2954 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0731 09:53:38.223533    2954 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0731 09:53:38.232307    2954 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2630 bytes)
	I0731 09:53:38.245888    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:38.355987    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:53:40.705059    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.349073816s)
	I0731 09:53:40.705149    2954 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 09:53:40.718481    2954 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0731 09:53:40.718506    2954 cache_images.go:84] Images are preloaded, skipping loading
	I0731 09:53:40.718529    2954 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 09:53:40.718621    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:53:40.718689    2954 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 09:53:40.756905    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:40.756918    2954 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0731 09:53:40.756931    2954 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 09:53:40.756946    2954 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 09:53:40.757028    2954 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%!"(MISSING)
	  nodefs.inodesFree: "0%!"(MISSING)
	  imagefs.available: "0%!"(MISSING)
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 09:53:40.757045    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:53:40.757094    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:53:40.770142    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:53:40.770212    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:53:40.770264    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:53:40.778467    2954 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 09:53:40.778510    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 09:53:40.786404    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 09:53:40.799629    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:53:40.814270    2954 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 09:53:40.827819    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1446 bytes)
	I0731 09:53:40.841352    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:53:40.844280    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:53:40.854288    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:40.961875    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:53:40.976988    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 09:53:40.977000    2954 certs.go:194] generating shared ca certs ...
	I0731 09:53:40.977011    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:40.977205    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:53:40.977278    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:53:40.977287    2954 certs.go:256] generating profile certs ...
	I0731 09:53:40.977331    2954 certs.go:363] generating signed profile cert for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:53:40.977344    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt with IP's: []
	I0731 09:53:41.064733    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt ...
	I0731 09:53:41.064749    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt: {Name:mk11f8b5ec16b878c9f692ccaff9a489ecc76fb2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.065074    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key ...
	I0731 09:53:41.065082    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key: {Name:mk18e6554cf3c807804faf77a7a9620e92860212 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.065322    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9
	I0731 09:53:41.065337    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.254]
	I0731 09:53:41.267360    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 ...
	I0731 09:53:41.267375    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9: {Name:mk9c13a9d071c94395118e1f00f992954683ef5b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.267745    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9 ...
	I0731 09:53:41.267755    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9: {Name:mk49f9f4ab2c1350a3cdb49ded7d6cffd5f069e5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.267965    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:53:41.268145    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:53:41.268307    2954 certs.go:363] generating signed profile cert for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:53:41.268320    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt with IP's: []
	I0731 09:53:41.352486    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt ...
	I0731 09:53:41.352499    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt: {Name:mk6759a3c690d7a9e990f65c338d22538c5b127a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.352775    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key ...
	I0731 09:53:41.352788    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key: {Name:mk4f661b46725a943b9862deb5f02f250855a1b5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.352992    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:53:41.353021    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:53:41.353040    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:53:41.353059    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:53:41.353078    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:53:41.353096    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:53:41.353115    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:53:41.353132    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:53:41.353229    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:53:41.353280    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:53:41.353289    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:53:41.353319    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:53:41.353348    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:53:41.353377    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:53:41.353444    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:53:41.353475    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.353494    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.353511    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.353950    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:53:41.373611    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:53:41.392573    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:53:41.412520    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:53:41.433349    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0731 09:53:41.452365    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0731 09:53:41.472032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:53:41.491092    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:53:41.510282    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:53:41.529242    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:53:41.549127    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:53:41.568112    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 09:53:41.581548    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:53:41.585729    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:53:41.594979    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.598924    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.598977    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.603300    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:53:41.612561    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:53:41.621665    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.624970    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.625005    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.629117    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 09:53:41.638283    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:53:41.647422    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.650741    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.650776    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.654995    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:53:41.664976    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:53:41.668030    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:53:41.668072    2954 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:53:41.668156    2954 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 09:53:41.680752    2954 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 09:53:41.691788    2954 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0731 09:53:41.701427    2954 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0731 09:53:41.710462    2954 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0731 09:53:41.710473    2954 kubeadm.go:157] found existing configuration files:
	
	I0731 09:53:41.710522    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0731 09:53:41.718051    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0731 09:53:41.718109    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0731 09:53:41.726696    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0731 09:53:41.737698    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0731 09:53:41.737751    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0731 09:53:41.745907    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0731 09:53:41.753641    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0731 09:53:41.753680    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0731 09:53:41.761450    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0731 09:53:41.769156    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0731 09:53:41.769207    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0731 09:53:41.777068    2954 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0731 09:53:41.848511    2954 kubeadm.go:310] [init] Using Kubernetes version: v1.30.3
	I0731 09:53:41.848564    2954 kubeadm.go:310] [preflight] Running pre-flight checks
	I0731 09:53:41.937481    2954 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0731 09:53:41.937568    2954 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0731 09:53:41.937658    2954 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0731 09:53:42.093209    2954 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0731 09:53:42.137661    2954 out.go:204]   - Generating certificates and keys ...
	I0731 09:53:42.137715    2954 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0731 09:53:42.137758    2954 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0731 09:53:42.784132    2954 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0731 09:53:42.954915    2954 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0731 09:53:43.064099    2954 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0731 09:53:43.107145    2954 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0731 09:53:43.256550    2954 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0731 09:53:43.256643    2954 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-393000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0731 09:53:43.365808    2954 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0731 09:53:43.365910    2954 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-393000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0731 09:53:43.496987    2954 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0731 09:53:43.811530    2954 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0731 09:53:43.998883    2954 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0731 09:53:43.999156    2954 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0731 09:53:44.246352    2954 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0731 09:53:44.460463    2954 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0731 09:53:44.552236    2954 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0731 09:53:44.656335    2954 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0731 09:53:44.920852    2954 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0731 09:53:44.921188    2954 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0731 09:53:44.922677    2954 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0731 09:53:44.944393    2954 out.go:204]   - Booting up control plane ...
	I0731 09:53:44.944462    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0731 09:53:44.944530    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0731 09:53:44.944583    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0731 09:53:44.944663    2954 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0731 09:53:44.944728    2954 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0731 09:53:44.944759    2954 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0731 09:53:45.048317    2954 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0731 09:53:45.048393    2954 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0731 09:53:45.548165    2954 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 500.802272ms
	I0731 09:53:45.548224    2954 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0731 09:53:51.610602    2954 kubeadm.go:310] [api-check] The API server is healthy after 6.066816222s
	I0731 09:53:51.618854    2954 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0731 09:53:51.625868    2954 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0731 09:53:51.637830    2954 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0731 09:53:51.637998    2954 kubeadm.go:310] [mark-control-plane] Marking the node ha-393000 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0731 09:53:51.650953    2954 kubeadm.go:310] [bootstrap-token] Using token: wt4o9v.66pnb4w7anxpqs79
	I0731 09:53:51.687406    2954 out.go:204]   - Configuring RBAC rules ...
	I0731 09:53:51.687587    2954 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0731 09:53:51.690002    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0731 09:53:51.716618    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0731 09:53:51.718333    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0731 09:53:51.720211    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0731 09:53:51.722003    2954 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0731 09:53:52.016537    2954 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0731 09:53:52.431449    2954 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0731 09:53:53.015675    2954 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0731 09:53:53.016431    2954 kubeadm.go:310] 
	I0731 09:53:53.016524    2954 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0731 09:53:53.016539    2954 kubeadm.go:310] 
	I0731 09:53:53.016612    2954 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0731 09:53:53.016623    2954 kubeadm.go:310] 
	I0731 09:53:53.016649    2954 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0731 09:53:53.016721    2954 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0731 09:53:53.016763    2954 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0731 09:53:53.016773    2954 kubeadm.go:310] 
	I0731 09:53:53.016814    2954 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0731 09:53:53.016821    2954 kubeadm.go:310] 
	I0731 09:53:53.016868    2954 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0731 09:53:53.016891    2954 kubeadm.go:310] 
	I0731 09:53:53.016935    2954 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0731 09:53:53.017005    2954 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0731 09:53:53.017059    2954 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0731 09:53:53.017072    2954 kubeadm.go:310] 
	I0731 09:53:53.017139    2954 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0731 09:53:53.017203    2954 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0731 09:53:53.017207    2954 kubeadm.go:310] 
	I0731 09:53:53.017269    2954 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token wt4o9v.66pnb4w7anxpqs79 \
	I0731 09:53:53.017353    2954 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 \
	I0731 09:53:53.017373    2954 kubeadm.go:310] 	--control-plane 
	I0731 09:53:53.017381    2954 kubeadm.go:310] 
	I0731 09:53:53.017452    2954 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0731 09:53:53.017461    2954 kubeadm.go:310] 
	I0731 09:53:53.017528    2954 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token wt4o9v.66pnb4w7anxpqs79 \
	I0731 09:53:53.017610    2954 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 
	I0731 09:53:53.018224    2954 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0731 09:53:53.018239    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:53.018245    2954 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0731 09:53:53.040097    2954 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0731 09:53:53.097376    2954 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0731 09:53:53.101992    2954 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.3/kubectl ...
	I0731 09:53:53.102004    2954 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0731 09:53:53.115926    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0731 09:53:53.335699    2954 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0731 09:53:53.335768    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:53.335769    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000 minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=true
	I0731 09:53:53.489955    2954 ops.go:34] apiserver oom_adj: -16
	I0731 09:53:53.490022    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:53.990085    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:54.490335    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:54.991422    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:55.490608    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:55.990200    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:56.490175    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:56.990807    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:57.491373    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:57.991164    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:58.491587    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:58.990197    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:59.490119    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:59.990444    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:00.490776    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:00.990123    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:01.490685    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:01.991905    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:02.490505    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:02.990148    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:03.490590    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:03.990745    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:04.491071    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:04.991117    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:05.490027    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:05.576301    2954 kubeadm.go:1113] duration metric: took 12.240698872s to wait for elevateKubeSystemPrivileges
	I0731 09:54:05.576324    2954 kubeadm.go:394] duration metric: took 23.908471214s to StartCluster
	I0731 09:54:05.576346    2954 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:05.576441    2954 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:54:05.576993    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:05.577274    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0731 09:54:05.577286    2954 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:05.577302    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:54:05.577319    2954 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 09:54:05.577357    2954 addons.go:69] Setting storage-provisioner=true in profile "ha-393000"
	I0731 09:54:05.577363    2954 addons.go:69] Setting default-storageclass=true in profile "ha-393000"
	I0731 09:54:05.577386    2954 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-393000"
	I0731 09:54:05.577386    2954 addons.go:234] Setting addon storage-provisioner=true in "ha-393000"
	I0731 09:54:05.577408    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:05.577423    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:05.577661    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.577669    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.577675    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.577679    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.587150    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51060
	I0731 09:54:05.587233    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51061
	I0731 09:54:05.587573    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.587584    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.587918    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.587919    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.587930    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.587931    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.588210    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.588232    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.588358    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.588454    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.588531    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.588614    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.588639    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.590714    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:54:05.590994    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 09:54:05.591385    2954 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 09:54:05.591537    2954 addons.go:234] Setting addon default-storageclass=true in "ha-393000"
	I0731 09:54:05.591560    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:05.591783    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.591798    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.597469    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51064
	I0731 09:54:05.597830    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.598161    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.598171    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.598405    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.598520    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.598612    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.598688    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.599681    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:05.600339    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51066
	I0731 09:54:05.600677    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.601035    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.601051    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.601254    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.601611    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.601637    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.610207    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51068
	I0731 09:54:05.610548    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.610892    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.610909    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.611149    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.611266    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.611351    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.611421    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.612421    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:05.612552    2954 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0731 09:54:05.612560    2954 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0731 09:54:05.612568    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:05.612695    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:05.612786    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:05.612891    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:05.612974    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:05.623428    2954 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0731 09:54:05.644440    2954 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0731 09:54:05.644452    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0731 09:54:05.644468    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:05.644630    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:05.644723    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:05.644822    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:05.644921    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:05.653382    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.169.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0731 09:54:05.687318    2954 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0731 09:54:05.764200    2954 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0731 09:54:06.182319    2954 start.go:971] {"host.minikube.internal": 192.169.0.1} host record injected into CoreDNS's ConfigMap
	I0731 09:54:06.182364    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.182377    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.182560    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.182561    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.182572    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.182582    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.182588    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.182708    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.182715    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.182734    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.182830    2954 round_trippers.go:463] GET https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0731 09:54:06.182842    2954 round_trippers.go:469] Request Headers:
	I0731 09:54:06.182849    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:54:06.182854    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:54:06.189976    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:54:06.190422    2954 round_trippers.go:463] PUT https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0731 09:54:06.190430    2954 round_trippers.go:469] Request Headers:
	I0731 09:54:06.190435    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:54:06.190439    2954 round_trippers.go:473]     Content-Type: application/json
	I0731 09:54:06.190441    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:54:06.192143    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:54:06.192277    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.192285    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.192466    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.192478    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.192482    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318368    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.318380    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.318552    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.318557    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318564    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.318573    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.318591    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.318752    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318752    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.318769    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.354999    2954 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
	I0731 09:54:06.412621    2954 addons.go:510] duration metric: took 835.314471ms for enable addons: enabled=[default-storageclass storage-provisioner]
	I0731 09:54:06.412653    2954 start.go:246] waiting for cluster config update ...
	I0731 09:54:06.412665    2954 start.go:255] writing updated cluster config ...
	I0731 09:54:06.449784    2954 out.go:177] 
	I0731 09:54:06.487284    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:06.487391    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:06.509688    2954 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 09:54:06.585678    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:54:06.585712    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:54:06.585911    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:54:06.585931    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:54:06.586023    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:06.586742    2954 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:54:06.586867    2954 start.go:364] duration metric: took 101.68µs to acquireMachinesLock for "ha-393000-m02"
	I0731 09:54:06.586897    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks
:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:06.586986    2954 start.go:125] createHost starting for "m02" (driver="hyperkit")
	I0731 09:54:06.608709    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:54:06.608788    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:06.608805    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:06.617299    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51073
	I0731 09:54:06.617638    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:06.618011    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:06.618029    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:06.618237    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:06.618326    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:06.618405    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:06.618514    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:54:06.618528    2954 client.go:168] LocalClient.Create starting
	I0731 09:54:06.618559    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:54:06.618609    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:54:06.618620    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:54:06.618668    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:54:06.618707    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:54:06.618717    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:54:06.618731    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:54:06.618737    2954 main.go:141] libmachine: (ha-393000-m02) Calling .PreCreateCheck
	I0731 09:54:06.618808    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:06.618841    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:06.646223    2954 main.go:141] libmachine: Creating machine...
	I0731 09:54:06.646236    2954 main.go:141] libmachine: (ha-393000-m02) Calling .Create
	I0731 09:54:06.646361    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:06.646520    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.646351    2979 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:54:06.646597    2954 main.go:141] libmachine: (ha-393000-m02) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:54:06.831715    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.831641    2979 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa...
	I0731 09:54:06.939142    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.939044    2979 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk...
	I0731 09:54:06.939162    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Writing magic tar header
	I0731 09:54:06.939170    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Writing SSH key tar header
	I0731 09:54:06.940042    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.939949    2979 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02 ...
	I0731 09:54:07.311809    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:07.311824    2954 main.go:141] libmachine: (ha-393000-m02) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 09:54:07.311866    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 09:54:07.337818    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 09:54:07.337835    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:54:07.337884    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:54:07.337912    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:54:07.337954    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machine
s/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:54:07.337986    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=t
tyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:54:07.338000    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:54:07.340860    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Pid is 2980
	I0731 09:54:07.341360    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 09:54:07.341374    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:07.341426    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:07.342343    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:07.342405    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:07.342418    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:07.342433    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:07.342443    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:07.342451    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:07.348297    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:54:07.357913    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:54:07.358688    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:54:07.358712    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:54:07.358723    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:54:07.358740    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:54:07.743017    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:54:07.743035    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:54:07.858034    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:54:07.858062    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:54:07.858072    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:54:07.858084    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:54:07.858884    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:54:07.858896    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:54:09.343775    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 1
	I0731 09:54:09.343792    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:09.343900    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:09.344720    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:09.344781    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:09.344792    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:09.344804    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:09.344817    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:09.344826    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:11.346829    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 2
	I0731 09:54:11.346846    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:11.346940    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:11.347752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:11.347766    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:11.347784    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:11.347795    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:11.347819    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:11.347832    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:13.348981    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 3
	I0731 09:54:13.349001    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:13.349109    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:13.349907    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:13.349943    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:13.349954    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:13.349965    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:13.349972    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:13.349980    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:13.459282    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 09:54:13.459342    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 09:54:13.459355    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 09:54:13.483197    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 09:54:15.351752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 4
	I0731 09:54:15.351769    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:15.351820    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:15.352675    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:15.352721    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:15.352735    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:15.352744    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:15.352752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:15.352760    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:17.353423    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 5
	I0731 09:54:17.353439    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:17.353530    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:17.354334    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:17.354363    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:54:17.354369    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:54:17.354392    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 09:54:17.354398    2954 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
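The retry loop logged above polls macOS's DHCP lease database, `/var/db/dhcpd_leases`, after each attempt until the VM's generated MAC address appears, then reads the matching IP. A minimal standalone sketch of that lookup, run against a temporary file with illustrative contents rather than the real lease database:

```shell
#!/bin/sh
# Build a sample leases file shaped like the entries logged above
# (the data here is illustrative, not from this run).
leases=$(mktemp)
cat > "$leases" <<'EOF'
{
  name=minikube
  ip_address=192.169.0.6
  hw_address=1,d6:c5:55:d7:1e:6a
}
EOF

mac="d6:c5:55:d7:1e:6a"
# Match on the MAC, as the driver does, and pull the ip_address line
# from the same lease block.
if grep -q "$mac" "$leases"; then
    ip=$(awk -v mac="$mac" '/ip_address/{ip=$0} $0 ~ mac {print ip}' "$leases" \
        | sed 's/.*ip_address=//')
    echo "IP: $ip"
fi
rm -f "$leases"
```

The driver wraps this check in a sleep-and-retry loop (Attempts 1 through 5 above) because the lease only appears once the guest's DHCP client has negotiated an address.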
	I0731 09:54:17.354469    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:17.355226    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:17.355356    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:17.355457    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:54:17.355466    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:54:17.355564    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:17.355626    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:17.356407    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:54:17.356415    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:54:17.356426    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:54:17.356432    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:17.356529    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:17.356628    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:17.356727    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:17.356823    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:17.356939    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:17.357111    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:17.357118    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:54:18.376907    2954 main.go:141] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I0731 09:54:21.440008    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:54:21.440021    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:54:21.440026    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.440157    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.440265    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.440360    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.440445    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.440567    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.440720    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.440728    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:54:21.502840    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:54:21.502894    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:54:21.502900    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:54:21.502905    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.503041    2954 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 09:54:21.503052    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.503150    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.503242    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.503322    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.503392    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.503473    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.503584    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.503728    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.503737    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 09:54:21.579730    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 09:54:21.579745    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.579874    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.579976    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.580070    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.580163    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.580287    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.580427    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.580439    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
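The SSH script above makes `/etc/hosts` idempotently map `127.0.1.1` to the node's hostname: if the name is already present it does nothing, if a `127.0.1.1` line exists it is rewritten in place, otherwise a new line is appended. A sketch of the same logic against a temporary copy instead of the real `/etc/hosts` (file contents are illustrative):

```shell
#!/bin/sh
# Seed a temp hosts file with a stale 127.0.1.1 mapping.
hosts=$(mktemp)
printf '127.0.0.1 localhost\n127.0.1.1 oldname\n' > "$hosts"

name="ha-393000-m02"
if ! grep -q "[[:space:]]$name" "$hosts"; then
    if grep -q '^127.0.1.1[[:space:]]' "$hosts"; then
        # Existing 127.0.1.1 entry: rewrite it in place.
        sed -i.bak "s/^127.0.1.1 .*/127.0.1.1 $name/" "$hosts"
    else
        # No entry yet: append one.
        echo "127.0.1.1 $name" >> "$hosts"
    fi
fi
content=$(cat "$hosts")
echo "$content"
rm -f "$hosts" "$hosts.bak"
```

Running the script a second time is a no-op, which is why the provisioner can safely re-run it on every `start`.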
	I0731 09:54:21.651021    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:54:21.651038    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:54:21.651048    2954 buildroot.go:174] setting up certificates
	I0731 09:54:21.651054    2954 provision.go:84] configureAuth start
	I0731 09:54:21.651061    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.651192    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:21.651290    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.651382    2954 provision.go:143] copyHostCerts
	I0731 09:54:21.651408    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:54:21.651454    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:54:21.651459    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:54:21.651611    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:54:21.651812    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:54:21.651848    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:54:21.651853    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:54:21.651933    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:54:21.652069    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:54:21.652109    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:54:21.652114    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:54:21.652196    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:54:21.652337    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 09:54:21.695144    2954 provision.go:177] copyRemoteCerts
	I0731 09:54:21.695204    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:54:21.695225    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.695363    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.695457    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.695544    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.695616    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:21.734262    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:54:21.734338    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 09:54:21.760893    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:54:21.760979    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 09:54:21.787062    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:54:21.787131    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:54:21.807971    2954 provision.go:87] duration metric: took 156.910143ms to configureAuth
	I0731 09:54:21.807985    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:54:21.808123    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:21.808137    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:21.808270    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.808350    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.808427    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.808504    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.808592    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.808693    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.808822    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.808830    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:54:21.871923    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:54:21.871936    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:54:21.872014    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:54:21.872025    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.872159    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.872242    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.872339    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.872432    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.872558    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.872693    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.872741    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:54:21.947253    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:54:21.947272    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.947434    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.947533    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.947607    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.947689    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.947845    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.947990    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.948005    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:54:23.521299    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 09:54:23.521320    2954 main.go:141] libmachine: Checking connection to Docker...
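The `diff -u old new || { mv ...; systemctl restart ...; }` command run just above is a write-if-changed idiom: the replace-and-restart branch only fires when the rendered unit differs from what is installed (here `diff` failed with "No such file or directory" because no unit existed yet, so the new file was installed and the service enabled). A sketch of the idiom on plain temp files, with the service restart reduced to a flag (file names and contents are illustrative):

```shell
#!/bin/sh
# diff exits non-zero when the files differ (or the old one is missing),
# which triggers the replace branch; identical files skip it entirely.
old=$(mktemp); new=$(mktemp)
echo "v1" > "$old"
echo "v2" > "$new"
restarted=no
diff -u "$old" "$new" > /dev/null 2>&1 || { mv "$new" "$old"; restarted=yes; }
content=$(cat "$old")
echo "restarted=$restarted content=$content"
rm -f "$old" "$new"
```

This keeps repeated provisioning cheap: an unchanged `docker.service` never causes a daemon-reload or a Docker restart.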
	I0731 09:54:23.521327    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetURL
	I0731 09:54:23.521467    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:54:23.521475    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:54:23.521480    2954 client.go:171] duration metric: took 16.903099578s to LocalClient.Create
	I0731 09:54:23.521492    2954 start.go:167] duration metric: took 16.903132869s to libmachine.API.Create "ha-393000"
	I0731 09:54:23.521498    2954 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 09:54:23.521504    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:54:23.521519    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.521663    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:54:23.521677    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.521769    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.521859    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.521933    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.522032    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:23.560604    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:54:23.563782    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:54:23.563793    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:54:23.563892    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:54:23.564080    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:54:23.564086    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:54:23.564293    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:54:23.571517    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:54:23.591429    2954 start.go:296] duration metric: took 69.922656ms for postStartSetup
	I0731 09:54:23.591460    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:23.592068    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:23.592212    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:23.592596    2954 start.go:128] duration metric: took 17.005735325s to createHost
	I0731 09:54:23.592609    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.592713    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.592826    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.592928    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.593022    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.593148    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:23.593279    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:23.593287    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:54:23.656618    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444863.810880618
	
	I0731 09:54:23.656630    2954 fix.go:216] guest clock: 1722444863.810880618
	I0731 09:54:23.656635    2954 fix.go:229] Guest: 2024-07-31 09:54:23.810880618 -0700 PDT Remote: 2024-07-31 09:54:23.592602 -0700 PDT m=+67.492982270 (delta=218.278618ms)
	I0731 09:54:23.656654    2954 fix.go:200] guest clock delta is within tolerance: 218.278618ms
	I0731 09:54:23.656663    2954 start.go:83] releasing machines lock for "ha-393000-m02", held for 17.069938552s
	I0731 09:54:23.656681    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.656811    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:23.684522    2954 out.go:177] * Found network options:
	I0731 09:54:23.836571    2954 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 09:54:23.866932    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:54:23.866975    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.867861    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.868089    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.868209    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:54:23.868288    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 09:54:23.868332    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:54:23.868439    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 09:54:23.868462    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.868525    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.868708    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.868756    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.868922    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.868944    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.869058    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:23.869081    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.869206    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 09:54:23.904135    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:54:23.904205    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:54:23.927324    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:54:23.927338    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:54:23.927400    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:54:23.970222    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:54:23.978777    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:54:23.987481    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:54:23.987533    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:54:23.996430    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:54:24.004692    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:54:24.012968    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:54:24.021204    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:54:24.030482    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:54:24.038802    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:54:24.047006    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:54:24.055781    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:54:24.063050    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:54:24.072089    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:24.169406    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
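	The containerd reconfiguration above is a series of in-place sed edits; the key one flips SystemdCgroup so containerd agrees with the "cgroupfs" driver the log selects. A minimal sketch of that edit, assuming GNU sed and a stand-in file (the real target is /etc/containerd/config.toml inside the guest, edited over SSH):

```shell
# Stand-in for /etc/containerd/config.toml; the real edit runs over SSH in the VM.
cfg=$(mktemp)
printf '    SystemdCgroup = true\n' > "$cfg"
# Same substitution the log runs: preserve indentation, force cgroupfs mode.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
```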
	I0731 09:54:24.189452    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:54:24.189519    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:54:24.202393    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:54:24.214821    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:54:24.229583    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:54:24.240171    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:54:24.250428    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:54:24.302946    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:54:24.313120    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:54:24.327912    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:54:24.331673    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:54:24.338902    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:54:24.352339    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:54:24.449032    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:54:24.557842    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:54:24.557870    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:54:24.571700    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:24.673137    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:54:27.047079    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.373944592s)
	I0731 09:54:27.047137    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:54:27.057410    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:54:27.071816    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:54:27.082278    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:54:27.176448    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:54:27.277016    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:27.384870    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:54:27.398860    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:54:27.409735    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:27.507837    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:54:27.568313    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:54:27.568381    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:54:27.573262    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:54:27.573320    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:54:27.579109    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:54:27.606116    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:54:27.606208    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:54:27.625621    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:54:27.663443    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:54:27.704938    2954 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 09:54:27.726212    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:27.726560    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:54:27.730336    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
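	The /etc/hosts update above uses a strip-then-append idiom so that repeated starts never accumulate duplicate host.minikube.internal entries. A sketch against a temporary file, with made-up addresses standing in for the guest's real /etc/hosts:

```shell
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n10.0.0.9\thost.minikube.internal\n' > "$hosts"
# Drop any stale mapping for the name, then append the current one.
{ grep -v $'\thost.minikube.internal$' "$hosts"; printf '192.169.0.1\thost.minikube.internal\n'; } > "$hosts.new"
```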
	I0731 09:54:27.740553    2954 mustload.go:65] Loading cluster: ha-393000
	I0731 09:54:27.740700    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:27.740921    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:27.740943    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:27.749667    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51097
	I0731 09:54:27.750028    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:27.750384    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:27.750401    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:27.750596    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:27.750732    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:27.750813    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:27.750888    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:27.751853    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:27.752094    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:27.752117    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:27.760565    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51099
	I0731 09:54:27.760882    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:27.761210    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:27.761223    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:27.761435    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:27.761551    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:27.761648    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.6
	I0731 09:54:27.761653    2954 certs.go:194] generating shared ca certs ...
	I0731 09:54:27.761672    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.761836    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:54:27.761936    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:54:27.761945    2954 certs.go:256] generating profile certs ...
	I0731 09:54:27.762034    2954 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:54:27.762058    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069
	I0731 09:54:27.762073    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0731 09:54:27.834156    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 ...
	I0731 09:54:27.834169    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069: {Name:mk0062f228b9fa8374eba60d674a49cb0265b988 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.834495    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069 ...
	I0731 09:54:27.834504    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069: {Name:mkd62a5cca652a59908630fd95f20d2e01386237 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.834713    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:54:27.834929    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:54:27.835197    2954 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:54:27.835206    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:54:27.835229    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:54:27.835247    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:54:27.835267    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:54:27.835284    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:54:27.835302    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:54:27.835321    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:54:27.835338    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:54:27.835425    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:54:27.835473    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:54:27.835481    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:54:27.835511    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:54:27.835539    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:54:27.835575    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:54:27.835647    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:54:27.835682    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:54:27.835703    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:54:27.835723    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:27.835762    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:27.835910    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:27.836005    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:27.836102    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:27.836203    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:27.868754    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0731 09:54:27.872390    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 09:54:27.881305    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0731 09:54:27.884697    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 09:54:27.893772    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 09:54:27.896980    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 09:54:27.905593    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0731 09:54:27.908812    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 09:54:27.916605    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0731 09:54:27.919921    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 09:54:27.927985    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0731 09:54:27.931223    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 09:54:27.940238    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:54:27.960044    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:54:27.980032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:54:27.999204    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:54:28.018549    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0731 09:54:28.037848    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 09:54:28.057376    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:54:28.076776    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:54:28.096215    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:54:28.115885    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:54:28.135490    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:54:28.154907    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 09:54:28.169275    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 09:54:28.183001    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 09:54:28.196610    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 09:54:28.210320    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 09:54:28.223811    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 09:54:28.237999    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 09:54:28.251767    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:54:28.256201    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:54:28.265361    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.268834    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.268882    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.273194    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:54:28.282819    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:54:28.292122    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.295585    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.295622    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.299894    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:54:28.308965    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:54:28.318848    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.322347    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.322383    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.326657    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
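	The openssl x509 -hash / ln -fs pairs above install each CA into the OpenSSL trust-store layout, where a certificate is found via a <subject-hash>.0 symlink next to it. A self-contained sketch with a throwaway CA (requires the openssl CLI; all names and paths here are illustrative stand-ins):

```shell
tmp=$(mktemp -d)
# Generate a throwaway self-signed CA to stand in for minikubeCA.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj '/CN=demoCA' \
  -keyout "$tmp/ca.key" -out "$tmp/ca.pem" 2>/dev/null
# Hash of the subject name; OpenSSL looks CAs up as "<hash>.0" in the cert dir.
h=$(openssl x509 -hash -noout -in "$tmp/ca.pem")
ln -fs "$tmp/ca.pem" "$tmp/$h.0"
```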
	I0731 09:54:28.335765    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:54:28.338885    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:54:28.338923    2954 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.30.3 docker true true} ...
	I0731 09:54:28.338981    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:54:28.338998    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:54:28.339031    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:54:28.352962    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:54:28.353010    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:54:28.353068    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:54:28.361447    2954 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 09:54:28.361501    2954 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 09:54:28.370031    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet
	I0731 09:54:28.370031    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm
	I0731 09:54:28.370036    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl
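	Each download.go fetch above pins the binary to a published .sha256 file (the checksum=file:... query). The verification step reduces to comparing a computed digest against the pinned one; a local sketch with a stand-in file (no network; sha256sum from coreutils assumed):

```shell
tmp=$(mktemp -d)
printf 'stand-in kubelet binary' > "$tmp/kubelet"
# The pinned checksum would normally be fetched from dl.k8s.io alongside the binary.
sha256sum "$tmp/kubelet" | awk '{print $1}' > "$tmp/kubelet.sha256"
want=$(cat "$tmp/kubelet.sha256")
got=$(sha256sum "$tmp/kubelet" | awk '{print $1}')
```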
	I0731 09:54:31.406224    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:54:31.406308    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:54:31.409804    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 09:54:31.409825    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 09:54:32.215163    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:54:32.215265    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:54:32.218832    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 09:54:32.218858    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 09:54:39.678084    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:54:39.690174    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:54:39.690295    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:54:39.693595    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 09:54:39.693614    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 09:54:39.964594    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 09:54:39.972786    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 09:54:39.986436    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:54:39.999856    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 09:54:40.013590    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:54:40.016608    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:54:40.026617    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:40.125738    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:54:40.142197    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:40.142482    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:40.142512    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:40.151352    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51126
	I0731 09:54:40.151710    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:40.152074    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:40.152091    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:40.152318    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:40.152428    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:40.152528    2954 start.go:317] joinCluster: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:54:40.152603    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0731 09:54:40.152616    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:40.152722    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:40.152805    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:40.152933    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:40.153036    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:40.232831    2954 start.go:343] trying to join control-plane node "m02" to cluster: &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:40.232861    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token heh6bo.7n85cftszx0hevpy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443"
	I0731 09:55:07.963279    2954 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token heh6bo.7n85cftszx0hevpy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443": (27.730638671s)
	I0731 09:55:07.963316    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0731 09:55:08.368958    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000-m02 minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=false
	I0731 09:55:08.452570    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-393000-m02 node-role.kubernetes.io/control-plane:NoSchedule-
	I0731 09:55:08.540019    2954 start.go:319] duration metric: took 28.387749448s to joinCluster
	I0731 09:55:08.540065    2954 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:08.540296    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:08.563232    2954 out.go:177] * Verifying Kubernetes components...
	I0731 09:55:08.603726    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:08.841318    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:55:08.872308    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:55:08.872512    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 09:55:08.872555    2954 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 09:55:08.872732    2954 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m02" to be "Ready" ...
	I0731 09:55:08.872795    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:08.872800    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:08.872806    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:08.872810    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:08.881842    2954 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0731 09:55:09.372875    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:09.372888    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:09.372894    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:09.372897    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:09.374975    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:09.872917    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:09.872929    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:09.872935    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:09.872939    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:09.875869    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.372943    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:10.372956    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:10.372964    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:10.372967    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:10.375041    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.874945    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:10.875020    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:10.875035    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:10.875043    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:10.877858    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.878307    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:11.373440    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:11.373461    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:11.373468    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:11.373472    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:11.376182    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:11.874612    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:11.874624    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:11.874630    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:11.874634    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:11.876432    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:12.374085    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:12.374098    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:12.374104    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:12.374107    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:12.376039    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:12.874234    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:12.874246    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:12.874252    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:12.874255    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:12.876210    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:13.374284    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:13.374372    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:13.374387    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:13.374396    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:13.377959    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:13.378403    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:13.873814    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:13.873839    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:13.873850    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:13.873856    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:13.876640    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:14.373497    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:14.373550    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:14.373561    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:14.373570    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:14.376681    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:14.872976    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:14.873065    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:14.873079    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:14.873087    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:14.875607    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:15.373684    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:15.373702    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:15.373711    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:15.373716    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:15.375839    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:15.873002    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:15.873028    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:15.873040    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:15.873049    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:15.876311    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:15.877408    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:16.373017    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:16.373044    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:16.373110    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:16.373119    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:16.376651    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:16.873932    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:16.873951    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:16.873958    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:16.873961    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:16.875945    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:17.372883    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:17.372963    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:17.372979    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:17.372987    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:17.375706    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:17.874312    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:17.874334    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:17.874343    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:17.874381    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:17.876575    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:18.374077    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:18.374176    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:18.374191    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:18.374197    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:18.377131    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:18.377505    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:18.874567    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:18.874589    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:18.874653    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:18.874658    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:18.877221    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:19.373331    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:19.373347    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:19.373387    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:19.373392    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:19.375412    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:19.873283    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:19.873307    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:19.873320    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:19.873326    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:19.876694    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.373050    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:20.373075    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:20.373086    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:20.373096    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:20.376371    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.874379    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:20.874402    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:20.874414    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:20.874421    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:20.877609    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.878167    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:21.373483    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:21.373509    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:21.373520    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:21.373526    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:21.376649    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:21.872794    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:21.872825    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:21.872832    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:21.872837    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:21.874864    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:22.373703    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:22.373721    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:22.373733    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:22.373739    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:22.376275    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:22.872731    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:22.872746    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:22.872752    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:22.872756    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:22.875078    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:23.373989    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:23.374007    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:23.374017    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:23.374021    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:23.376252    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:23.376876    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:23.874071    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:23.874095    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:23.874118    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:23.874128    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:23.877415    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:24.373797    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:24.373828    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:24.373836    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:24.373842    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:24.375723    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:24.873198    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:24.873217    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:24.873239    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:24.873242    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:24.874997    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:25.373864    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:25.373964    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:25.373983    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:25.373993    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:25.376940    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:25.377783    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:25.873066    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:25.873140    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:25.873157    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:25.873167    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:25.876035    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:26.373560    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:26.373582    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:26.373594    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:26.373600    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:26.376763    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:26.872802    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:26.872826    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:26.872847    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:26.872855    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:26.875665    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.372793    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.372848    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.372859    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.372865    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.375283    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.872817    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.872887    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.872897    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.872902    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.875143    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.875477    2954 node_ready.go:49] node "ha-393000-m02" has status "Ready":"True"
	I0731 09:55:27.875491    2954 node_ready.go:38] duration metric: took 19.002910931s for node "ha-393000-m02" to be "Ready" ...
	I0731 09:55:27.875498    2954 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:55:27.875539    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:27.875545    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.875550    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.875554    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.884028    2954 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0731 09:55:27.888275    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.888338    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 09:55:27.888344    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.888351    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.888354    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.895154    2954 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 09:55:27.895668    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.895676    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.895682    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.895685    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.903221    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:55:27.903585    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.903594    2954 pod_ready.go:81] duration metric: took 15.30431ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.903601    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.903644    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 09:55:27.903649    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.903655    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.903659    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.910903    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:55:27.911272    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.911279    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.911284    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.911287    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.912846    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.913176    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.913184    2954 pod_ready.go:81] duration metric: took 9.57768ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.913191    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.913223    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 09:55:27.913228    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.913233    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.913237    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.914947    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.915374    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.915380    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.915386    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.915390    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.916800    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.917134    2954 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.917142    2954 pod_ready.go:81] duration metric: took 3.945951ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.917148    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.917182    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 09:55:27.917186    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.917192    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.917199    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.919108    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.919519    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.919526    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.919532    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.919538    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.920909    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.921212    2954 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.921221    2954 pod_ready.go:81] duration metric: took 4.068426ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.921231    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.073440    2954 request.go:629] Waited for 152.136555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:55:28.073539    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:55:28.073547    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.073555    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.073561    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.075944    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:28.272878    2954 request.go:629] Waited for 196.473522ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:28.272966    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:28.272972    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.272978    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.272981    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.274914    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:28.275308    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:28.275318    2954 pod_ready.go:81] duration metric: took 354.084518ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.275325    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.473409    2954 request.go:629] Waited for 198.051207ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:55:28.473441    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:55:28.473447    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.473463    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.473467    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.475323    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:28.673703    2954 request.go:629] Waited for 197.835098ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:28.673754    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:28.673765    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.673772    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.673777    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.676049    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:28.676485    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:28.676497    2954 pod_ready.go:81] duration metric: took 401.169334ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.676504    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.874899    2954 request.go:629] Waited for 198.343236ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:55:28.875005    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:55:28.875014    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.875025    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.875031    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.878371    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:29.072894    2954 request.go:629] Waited for 193.894527ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:29.072997    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:29.073009    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.073020    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.073029    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.075911    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.076354    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.076367    2954 pod_ready.go:81] duration metric: took 399.859987ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.076376    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.273708    2954 request.go:629] Waited for 197.294345ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:55:29.273758    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:55:29.273806    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.273815    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.273819    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.276500    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.473244    2954 request.go:629] Waited for 196.211404ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.473347    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.473355    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.473363    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.473367    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.475855    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.476256    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.476266    2954 pod_ready.go:81] duration metric: took 399.888458ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.476273    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.672987    2954 request.go:629] Waited for 196.670765ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:55:29.673094    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:55:29.673114    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.673128    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.673135    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.676240    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:29.874264    2954 request.go:629] Waited for 197.423472ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.874348    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.874352    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.874365    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.874369    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.876229    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:29.876542    2954 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.876551    2954 pod_ready.go:81] duration metric: took 400.273525ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.876557    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.073985    2954 request.go:629] Waited for 197.386483ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:55:30.074064    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:55:30.074071    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.074076    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.074080    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.075934    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:30.274353    2954 request.go:629] Waited for 197.921759ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.274399    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.274408    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.274421    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.274429    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.276767    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:30.277075    2954 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:30.277085    2954 pod_ready.go:81] duration metric: took 400.525562ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.277092    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.474867    2954 request.go:629] Waited for 197.733458ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:55:30.474919    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:55:30.474936    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.474949    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.474958    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.478180    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:30.673620    2954 request.go:629] Waited for 194.924994ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.673658    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.673662    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.673668    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.673674    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.675356    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:30.675625    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:30.675634    2954 pod_ready.go:81] duration metric: took 398.539654ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.675640    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.873712    2954 request.go:629] Waited for 198.03899ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:55:30.873795    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:55:30.873801    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.873807    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.873811    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.875750    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:31.074152    2954 request.go:629] Waited for 197.932145ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:31.074207    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:31.074215    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.074227    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.074234    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.077132    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:31.077723    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:31.077735    2954 pod_ready.go:81] duration metric: took 402.091925ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:31.077744    2954 pod_ready.go:38] duration metric: took 3.202266702s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:55:31.077770    2954 api_server.go:52] waiting for apiserver process to appear ...
	I0731 09:55:31.077872    2954 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:55:31.089706    2954 api_server.go:72] duration metric: took 22.549827849s to wait for apiserver process to appear ...
	I0731 09:55:31.089719    2954 api_server.go:88] waiting for apiserver healthz status ...
	I0731 09:55:31.089735    2954 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 09:55:31.093731    2954 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 09:55:31.093774    2954 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 09:55:31.093779    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.093785    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.093789    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.094287    2954 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 09:55:31.094337    2954 api_server.go:141] control plane version: v1.30.3
	I0731 09:55:31.094346    2954 api_server.go:131] duration metric: took 4.622445ms to wait for apiserver health ...
	I0731 09:55:31.094351    2954 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 09:55:31.272834    2954 request.go:629] Waited for 178.447514ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.272864    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.272868    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.272874    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.272879    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.275929    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:31.278922    2954 system_pods.go:59] 17 kube-system pods found
	I0731 09:55:31.278939    2954 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:55:31.278943    2954 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:55:31.278948    2954 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:55:31.278951    2954 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:55:31.278954    2954 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:55:31.278957    2954 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:55:31.278960    2954 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:55:31.278963    2954 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:55:31.278966    2954 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:55:31.278968    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:55:31.278971    2954 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:55:31.278973    2954 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:55:31.278976    2954 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:55:31.278982    2954 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:55:31.278986    2954 system_pods.go:61] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:55:31.278988    2954 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:55:31.278991    2954 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:55:31.278996    2954 system_pods.go:74] duration metric: took 184.642078ms to wait for pod list to return data ...
	I0731 09:55:31.279002    2954 default_sa.go:34] waiting for default service account to be created ...
	I0731 09:55:31.473455    2954 request.go:629] Waited for 194.413647ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:55:31.473487    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:55:31.473492    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.473498    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.473502    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.475460    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:31.475608    2954 default_sa.go:45] found service account: "default"
	I0731 09:55:31.475618    2954 default_sa.go:55] duration metric: took 196.612794ms for default service account to be created ...
	I0731 09:55:31.475624    2954 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 09:55:31.673326    2954 request.go:629] Waited for 197.663631ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.673362    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.673369    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.673377    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.673384    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.676582    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:31.680023    2954 system_pods.go:86] 17 kube-system pods found
	I0731 09:55:31.680035    2954 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:55:31.680039    2954 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:55:31.680042    2954 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:55:31.680045    2954 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:55:31.680048    2954 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:55:31.680051    2954 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:55:31.680054    2954 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:55:31.680057    2954 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:55:31.680060    2954 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:55:31.680063    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:55:31.680067    2954 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:55:31.680070    2954 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:55:31.680073    2954 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:55:31.680076    2954 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:55:31.680079    2954 system_pods.go:89] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:55:31.680082    2954 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:55:31.680085    2954 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:55:31.680089    2954 system_pods.go:126] duration metric: took 204.462284ms to wait for k8s-apps to be running ...
	I0731 09:55:31.680093    2954 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 09:55:31.680137    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:55:31.691384    2954 system_svc.go:56] duration metric: took 11.279108ms WaitForService to wait for kubelet
	I0731 09:55:31.691399    2954 kubeadm.go:582] duration metric: took 23.151526974s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:55:31.691411    2954 node_conditions.go:102] verifying NodePressure condition ...
	I0731 09:55:31.872842    2954 request.go:629] Waited for 181.393446ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 09:55:31.872873    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 09:55:31.872877    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.872884    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.872887    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.875560    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:31.876076    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:55:31.876090    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:55:31.876101    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:55:31.876111    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:55:31.876115    2954 node_conditions.go:105] duration metric: took 184.70211ms to run NodePressure ...
	I0731 09:55:31.876123    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:55:31.876138    2954 start.go:255] writing updated cluster config ...
	I0731 09:55:31.896708    2954 out.go:177] 
	I0731 09:55:31.917824    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:31.917916    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:31.939502    2954 out.go:177] * Starting "ha-393000-m03" control-plane node in "ha-393000" cluster
	I0731 09:55:31.981501    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:55:31.981523    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:55:31.981705    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:55:31.981717    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:55:31.981841    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:31.982574    2954 start.go:360] acquireMachinesLock for ha-393000-m03: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:55:31.982642    2954 start.go:364] duration metric: took 52.194µs to acquireMachinesLock for "ha-393000-m03"
	I0731 09:55:31.982663    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m03 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:31.982776    2954 start.go:125] createHost starting for "m03" (driver="hyperkit")
	I0731 09:55:32.003523    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:55:32.003599    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:32.003626    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:32.012279    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51131
	I0731 09:55:32.012622    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:32.012991    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:32.013008    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:32.013225    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:32.013332    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:32.013417    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:32.013511    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:55:32.013531    2954 client.go:168] LocalClient.Create starting
	I0731 09:55:32.013562    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:55:32.013605    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:55:32.013616    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:55:32.013658    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:55:32.013685    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:55:32.013695    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:55:32.013708    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:55:32.013722    2954 main.go:141] libmachine: (ha-393000-m03) Calling .PreCreateCheck
	I0731 09:55:32.013796    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:32.013821    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:32.024803    2954 main.go:141] libmachine: Creating machine...
	I0731 09:55:32.024819    2954 main.go:141] libmachine: (ha-393000-m03) Calling .Create
	I0731 09:55:32.024954    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:32.025189    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.024948    2993 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:55:32.025311    2954 main.go:141] libmachine: (ha-393000-m03) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:55:32.387382    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.387300    2993 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa...
	I0731 09:55:32.468181    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.468125    2993 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk...
	I0731 09:55:32.468207    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Writing magic tar header
	I0731 09:55:32.468229    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Writing SSH key tar header
	I0731 09:55:32.468792    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.468762    2993 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03 ...
	I0731 09:55:33.078663    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:33.078680    2954 main.go:141] libmachine: (ha-393000-m03) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid
	I0731 09:55:33.078716    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Using UUID 451d6bef-97a7-42a6-8ccb-b8851dda0594
	I0731 09:55:33.103258    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Generated MAC 3e:56:a2:18:e2:4c
	I0731 09:55:33.103280    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:55:33.103347    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:55:33.103394    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:55:33.103443    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "451d6bef-97a7-42a6-8ccb-b8851dda0594", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:55:33.103490    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 451d6bef-97a7-42a6-8ccb-b8851dda0594 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:55:33.103507    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:55:33.106351    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Pid is 2994
	I0731 09:55:33.106790    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 0
	I0731 09:55:33.106810    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:33.106894    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:33.107878    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:33.107923    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:33.107940    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:33.107959    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:33.107977    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:33.107995    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:33.108059    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:33.114040    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:55:33.122160    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:55:33.123003    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:55:33.123036    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:55:33.123053    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:55:33.123062    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:55:33.505461    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:55:33.505481    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:55:33.620173    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:55:33.620193    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:55:33.620213    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:55:33.620225    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:55:33.621055    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:55:33.621064    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:55:35.108561    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 1
	I0731 09:55:35.108578    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:35.108664    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:35.109476    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:35.109527    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:35.109535    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:35.109543    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:35.109553    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:35.109564    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:35.109588    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:37.111452    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 2
	I0731 09:55:37.111469    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:37.111534    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:37.112347    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:37.112387    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:37.112400    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:37.112409    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:37.112418    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:37.112431    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:37.112438    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:39.113861    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 3
	I0731 09:55:39.113876    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:39.113989    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:39.114793    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:39.114841    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:39.114854    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:39.114871    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:39.114881    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:39.114894    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:39.114910    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:39.197635    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 09:55:39.197744    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 09:55:39.197756    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 09:55:39.222062    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 09:55:41.116408    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 4
	I0731 09:55:41.116425    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:41.116529    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:41.117328    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:41.117368    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:41.117376    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:41.117399    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:41.117416    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:41.117425    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:41.117441    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:43.117722    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 5
	I0731 09:55:43.117737    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:43.117828    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:43.118651    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:43.118699    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0731 09:55:43.118714    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 09:55:43.118721    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found match: 3e:56:a2:18:e2:4c
	I0731 09:55:43.118726    2954 main.go:141] libmachine: (ha-393000-m03) DBG | IP: 192.169.0.7
	I0731 09:55:43.118795    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:43.119393    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:43.119491    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:43.119572    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:55:43.119580    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:55:43.119659    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:43.119724    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:43.120517    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:55:43.120525    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:55:43.120529    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:55:43.120540    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:43.120627    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:43.120733    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:43.120830    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:43.120937    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:43.121066    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:43.121248    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:43.121256    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:55:44.180872    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:55:44.180885    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:55:44.180891    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.181020    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.181119    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.181200    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.181293    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.181426    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.181579    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.181587    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:55:44.244214    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:55:44.244264    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:55:44.244271    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:55:44.244277    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.244401    2954 buildroot.go:166] provisioning hostname "ha-393000-m03"
	I0731 09:55:44.244413    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.244502    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.244591    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.244669    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.244754    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.244838    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.244957    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.245103    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.245112    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m03 && echo "ha-393000-m03" | sudo tee /etc/hostname
	I0731 09:55:44.315698    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m03
	
	I0731 09:55:44.315714    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.315853    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.315950    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.316034    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.316117    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.316237    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.316383    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.316394    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:55:44.383039    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:55:44.383055    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:55:44.383064    2954 buildroot.go:174] setting up certificates
	I0731 09:55:44.383071    2954 provision.go:84] configureAuth start
	I0731 09:55:44.383077    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.383215    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:44.383314    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.383408    2954 provision.go:143] copyHostCerts
	I0731 09:55:44.383435    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:55:44.383482    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:55:44.383490    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:55:44.383608    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:55:44.383821    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:55:44.383853    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:55:44.383859    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:55:44.383930    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:55:44.384107    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:55:44.384137    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:55:44.384146    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:55:44.384214    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:55:44.384364    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m03 san=[127.0.0.1 192.169.0.7 ha-393000-m03 localhost minikube]
	I0731 09:55:44.436199    2954 provision.go:177] copyRemoteCerts
	I0731 09:55:44.436250    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:55:44.436265    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.436405    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.436484    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.436578    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.436651    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:44.474166    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:55:44.474251    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:55:44.495026    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:55:44.495089    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 09:55:44.514528    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:55:44.514597    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 09:55:44.534382    2954 provision.go:87] duration metric: took 151.304295ms to configureAuth
	I0731 09:55:44.534397    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:55:44.534572    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:44.534587    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:44.534721    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.534815    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.534895    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.534982    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.535063    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.535176    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.535303    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.535311    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:55:44.595832    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:55:44.595845    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:55:44.595915    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:55:44.595926    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.596055    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.596141    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.596224    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.596312    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.596436    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.596585    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.596629    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:55:44.668428    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:55:44.668446    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.668587    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.668687    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.668775    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.668883    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.669009    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.669153    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.669165    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:55:46.245712    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
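The `diff ... || { mv ...; systemctl ...; }` one-liner above is how minikube installs the freshly rendered unit: `diff -u` succeeds (and nothing happens) when the installed unit already matches `docker.service.new`, and only on a mismatch does the `||` branch swap the file in, reload systemd, and restart Docker. A minimal local sketch of that compare-then-swap shape, with temp files standing in for `/lib/systemd/system` and an `echo` standing in for the `systemctl` calls:

```shell
# Sketch (not minikube's actual code) of the compare-then-swap unit update the
# log runs over SSH. Temp files replace /lib/systemd/system; no systemd needed.
dir=$(mktemp -d)
printf 'ExecStart=/usr/bin/dockerd --old-flags\n' > "$dir/docker.service"
printf 'ExecStart=/usr/bin/dockerd --new-flags\n' > "$dir/docker.service.new"
if diff -u "$dir/docker.service" "$dir/docker.service.new" >/dev/null 2>&1; then
  action=skipped     # identical: leave the unit and the running daemon alone
else
  mv "$dir/docker.service.new" "$dir/docker.service"
  action=restarted   # real flow: systemctl daemon-reload && systemctl restart docker
fi
echo "docker.service $action"
```

In the log the `diff` fails with `can't stat '/lib/systemd/system/docker.service'` because no unit exists yet on the fresh node, so on first boot the replace branch always runs.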
	I0731 09:55:46.245728    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:55:46.245733    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetURL
	I0731 09:55:46.245877    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:55:46.245886    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:55:46.245891    2954 client.go:171] duration metric: took 14.176451747s to LocalClient.Create
	I0731 09:55:46.245904    2954 start.go:167] duration metric: took 14.176491485s to libmachine.API.Create "ha-393000"
	I0731 09:55:46.245910    2954 start.go:293] postStartSetup for "ha-393000-m03" (driver="hyperkit")
	I0731 09:55:46.245917    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:55:46.245936    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.246092    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:55:46.246107    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.246216    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.246326    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.246431    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.246511    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:46.290725    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:55:46.294553    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:55:46.294567    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:55:46.294659    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:55:46.294805    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:55:46.294812    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:55:46.294995    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:55:46.303032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:55:46.335630    2954 start.go:296] duration metric: took 89.711926ms for postStartSetup
	I0731 09:55:46.335676    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:46.336339    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:46.336499    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:46.336864    2954 start.go:128] duration metric: took 14.298177246s to createHost
	I0731 09:55:46.336879    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.336971    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.337062    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.337141    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.337213    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.337332    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:46.337451    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:46.337458    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:55:46.398217    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444946.512017695
	
	I0731 09:55:46.398229    2954 fix.go:216] guest clock: 1722444946.512017695
	I0731 09:55:46.398235    2954 fix.go:229] Guest: 2024-07-31 09:55:46.512017695 -0700 PDT Remote: 2024-07-31 09:55:46.336873 -0700 PDT m=+150.181968458 (delta=175.144695ms)
	I0731 09:55:46.398245    2954 fix.go:200] guest clock delta is within tolerance: 175.144695ms
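The `date +%!s(MISSING).%!N(MISSING)` above is a Go printf-escaping artifact in the log; judging by the `1722444946.512017695` output, the command sent over SSH is almost certainly `date +%s.%N`, whose seconds.nanoseconds value fix.go compares against the host clock. A rough local sketch of that tolerance check, sampling the local clock twice in place of guest and host:

```shell
# Sketch of the guest-clock tolerance check from fix.go: read the guest clock,
# compare it with the host clock, and accept a small delta. Both samples here
# come from the same local clock, so the delta is ~0.
guest=$(date +%s)   # stand-in for the `date +%s.%N` value read over SSH
host=$(date +%s)
delta=$((host - guest))
abs=${delta#-}      # absolute value of the whole-second delta
if [ "$abs" -le 2 ]; then
  echo "guest clock delta is within tolerance: ${delta}s"
fi
```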
	I0731 09:55:46.398250    2954 start.go:83] releasing machines lock for "ha-393000-m03", held for 14.359697621s
	I0731 09:55:46.398269    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.398407    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:46.418329    2954 out.go:177] * Found network options:
	I0731 09:55:46.439149    2954 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0731 09:55:46.477220    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 09:55:46.477241    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:55:46.477255    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.477897    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.478058    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.478150    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:55:46.478196    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	W0731 09:55:46.478232    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 09:55:46.478262    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:55:46.478353    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 09:55:46.478353    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.478369    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.478511    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.478558    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.478670    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.478731    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.478785    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:46.478828    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.478931    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	W0731 09:55:46.512520    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:55:46.512591    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:55:46.558288    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:55:46.558305    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:55:46.558391    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:55:46.574105    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:55:46.582997    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:55:46.591920    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:55:46.591969    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:55:46.600962    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:55:46.610057    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:55:46.619019    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:55:46.627876    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:55:46.637129    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:55:46.646079    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:55:46.655162    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
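The run of `sed` commands above rewrites `/etc/containerd/config.toml` so containerd uses the cgroupfs driver, matching the `configuring containerd to use "cgroupfs"` line. A local sketch of the key substitution, operating on a throwaway copy of the config and assuming GNU sed for `-i -r`:

```shell
# Sketch of the SystemdCgroup rewrite, run here against a temp copy rather
# than /etc/containerd/config.toml.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF
# Same substitution the log runs (indentation-preserving, GNU sed assumed):
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
count=$(grep -c 'SystemdCgroup = false' "$cfg")
echo "flipped $count SystemdCgroup line(s) to false"
```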
	I0731 09:55:46.664198    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:55:46.672256    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:55:46.680371    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:46.778919    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:55:46.798064    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:55:46.798132    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:55:46.815390    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:55:46.827644    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:55:46.842559    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:55:46.853790    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:55:46.864444    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:55:46.887653    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:55:46.898070    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:55:46.913256    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:55:46.916263    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:55:46.923424    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:55:46.937344    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:55:47.035092    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:55:47.134788    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:55:47.134810    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:55:47.149022    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:47.247660    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:55:49.540717    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.293040269s)
	I0731 09:55:49.540778    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:55:49.551148    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:55:49.563946    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:55:49.574438    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:55:49.675905    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:55:49.777958    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:49.889335    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:55:49.903338    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:55:49.914450    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:50.020127    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:55:50.079269    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:55:50.079351    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:55:50.085411    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:55:50.085468    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:55:50.088527    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:55:50.115874    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:55:50.115947    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:55:50.133371    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:55:50.177817    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:55:50.199409    2954 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 09:55:50.242341    2954 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 09:55:50.263457    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:50.263780    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:55:50.267924    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
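The `/bin/bash -c` one-liner above refreshes the `host.minikube.internal` entry non-destructively: strip any existing line for that name, append the current mapping, and copy the temp file back over `/etc/hosts`. A sketch of the same dance against a throwaway hosts file (the `192.169.0.9` stale entry is invented for illustration, and `sudo cp` becomes a plain `cp`):

```shell
# Sketch of the /etc/hosts refresh pattern from the log, on a temp file.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.169.0.9\thost.minikube.internal\n' > "$hosts"
tab=$(printf '\t')
tmp="/tmp/h.$$"
# Drop the stale mapping, append the fresh one, then copy the result back:
{ grep -v "${tab}host.minikube.internal\$" "$hosts"
  printf '192.169.0.1\thost.minikube.internal\n'; } > "$tmp"
cp "$tmp" "$hosts"
entries=$(grep -c "host.minikube.internal" "$hosts")
ip=$(awk '$2 == "host.minikube.internal" { print $1 }' "$hosts")
echo "$entries entry -> $ip"
rm -f "$tmp"
```

Writing to `/tmp/h.$$` first means a failed `grep`/`echo` never leaves `/etc/hosts` half-written.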
	I0731 09:55:50.277257    2954 mustload.go:65] Loading cluster: ha-393000
	I0731 09:55:50.277434    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:50.277675    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:50.277699    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:50.286469    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51154
	I0731 09:55:50.286803    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:50.287152    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:50.287174    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:50.287405    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:50.287529    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:55:50.287619    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:50.287687    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:55:50.288682    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:55:50.288947    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:50.288976    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:50.297641    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51156
	I0731 09:55:50.297976    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:50.298336    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:50.298356    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:50.298557    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:50.298695    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:55:50.298796    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.7
	I0731 09:55:50.298803    2954 certs.go:194] generating shared ca certs ...
	I0731 09:55:50.298815    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.298953    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:55:50.299004    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:55:50.299013    2954 certs.go:256] generating profile certs ...
	I0731 09:55:50.299104    2954 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:55:50.299126    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb
	I0731 09:55:50.299146    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0731 09:55:50.438174    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb ...
	I0731 09:55:50.438189    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb: {Name:mk221449ac60933abd0b425ad947a6ab1580c0ce Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.438543    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb ...
	I0731 09:55:50.438553    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb: {Name:mk1cb7896668e4a7a9edaf8893989143a67a7948 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.438773    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:55:50.438957    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:55:50.439187    2954 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:55:50.439201    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:55:50.439224    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:55:50.439243    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:55:50.439262    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:55:50.439280    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:55:50.439299    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:55:50.439317    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:55:50.439334    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:55:50.439423    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:55:50.439459    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:55:50.439466    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:55:50.439503    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:55:50.439532    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:55:50.439561    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:55:50.439623    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:55:50.439662    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.439683    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.439702    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.439730    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:55:50.439869    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:55:50.439971    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:55:50.440060    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:55:50.440149    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:55:50.470145    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0731 09:55:50.473304    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 09:55:50.482843    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0731 09:55:50.486120    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 09:55:50.495117    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 09:55:50.498266    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 09:55:50.507788    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0731 09:55:50.510913    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 09:55:50.519933    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0731 09:55:50.523042    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 09:55:50.531891    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0731 09:55:50.535096    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 09:55:50.544058    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:55:50.564330    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:55:50.585250    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:55:50.605412    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:55:50.625492    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1444 bytes)
	I0731 09:55:50.645935    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 09:55:50.666578    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:55:50.686734    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:55:50.707428    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:55:50.728977    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:55:50.749365    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:55:50.769217    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 09:55:50.782635    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 09:55:50.796452    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 09:55:50.810265    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 09:55:50.823856    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 09:55:50.837713    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 09:55:50.851806    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 09:55:50.865643    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:55:50.869985    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:55:50.878755    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.882092    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.882127    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.886361    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 09:55:50.894800    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:55:50.903511    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.906902    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.906941    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.911184    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:55:50.919457    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:55:50.927999    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.931344    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.931398    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.935641    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:55:50.944150    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:55:50.947330    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:55:50.947373    2954 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.30.3 docker true true} ...
	I0731 09:55:50.947432    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:55:50.947450    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:55:50.947488    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:55:50.960195    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:55:50.960253    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:55:50.960307    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:55:50.968017    2954 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 09:55:50.968069    2954 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256
	I0731 09:55:50.975489    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256
	I0731 09:55:50.975509    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:55:50.975519    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:55:50.975557    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:55:50.976020    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:55:50.987294    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:55:50.987330    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 09:55:50.987350    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 09:55:50.987377    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 09:55:50.987399    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 09:55:50.987416    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:55:51.010057    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 09:55:51.010100    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 09:55:51.683575    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 09:55:51.690828    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 09:55:51.704403    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:55:51.718184    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 09:55:51.732058    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:55:51.735039    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:55:51.744606    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:51.842284    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:55:51.858313    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:55:51.858589    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:51.858612    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:51.867825    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51159
	I0731 09:55:51.868326    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:51.868657    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:51.868668    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:51.868882    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:51.868991    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:55:51.869077    2954 start.go:317] joinCluster: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 Clu
sterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:fals
e inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimi
zations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:55:51.869219    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0731 09:55:51.869241    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:55:51.869330    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:55:51.869408    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:55:51.869497    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:55:51.869579    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:55:51.957634    2954 start.go:343] trying to join control-plane node "m03" to cluster: &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:51.957691    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 3o7i0i.qey1hcj8w6i3nuyy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443"
	I0731 09:56:20.527748    2954 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 3o7i0i.qey1hcj8w6i3nuyy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443": (28.570050327s)
	I0731 09:56:20.527779    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0731 09:56:20.987700    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000-m03 minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=false
	I0731 09:56:21.064233    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-393000-m03 node-role.kubernetes.io/control-plane:NoSchedule-
	I0731 09:56:21.148165    2954 start.go:319] duration metric: took 29.279096383s to joinCluster
	I0731 09:56:21.148219    2954 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:56:21.148483    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:56:21.189791    2954 out.go:177] * Verifying Kubernetes components...
	I0731 09:56:21.248129    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:56:21.485219    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:56:21.507788    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:56:21.508040    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 09:56:21.508088    2954 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 09:56:21.508300    2954 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m03" to be "Ready" ...
	I0731 09:56:21.508342    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:21.508347    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:21.508353    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:21.508357    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:21.510586    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:22.008706    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:22.008723    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:22.008734    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:22.008738    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:22.010978    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:22.509350    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:22.509366    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:22.509372    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:22.509375    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:22.511656    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:23.009510    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:23.009526    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:23.009532    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:23.009535    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:23.011420    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:23.508500    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:23.508516    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:23.508523    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:23.508526    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:23.510720    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:23.511145    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:24.009377    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:24.009394    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:24.009439    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:24.009443    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:24.011828    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:24.509345    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:24.509361    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:24.509368    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:24.509372    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:24.511614    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:25.009402    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:25.009418    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:25.009424    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:25.009428    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:25.011344    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:25.508774    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:25.508790    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:25.508797    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:25.508800    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:25.510932    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:25.511292    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:26.008449    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:26.008465    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:26.008471    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:26.008474    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:26.010614    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:26.509754    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:26.509786    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:26.509799    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:26.509805    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:26.512347    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:27.008498    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:27.008592    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:27.008608    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:27.008615    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:27.011956    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:27.509028    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:27.509110    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:27.509125    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:27.509132    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:27.512133    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:27.512700    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:28.008990    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:28.009083    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:28.009097    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:28.009103    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:28.012126    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:28.509594    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:28.509612    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:28.509621    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:28.509625    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:28.512206    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:29.009613    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:29.009628    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:29.009634    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:29.009637    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:29.011661    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:29.509044    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:29.509059    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:29.509065    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:29.509068    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:29.511159    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:30.008831    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:30.008905    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:30.008916    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:30.008922    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:30.011246    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:30.011529    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:30.509817    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:30.509832    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:30.509838    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:30.509846    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:30.511920    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:31.008461    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:31.008483    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:31.008493    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:31.008499    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:31.011053    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:31.509184    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:31.509236    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:31.509247    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:31.509252    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:31.511776    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:32.008486    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:32.008510    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:32.008522    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:32.008531    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:32.011649    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:32.012066    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:32.510023    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:32.510037    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:32.510044    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:32.510048    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:32.512097    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:33.010283    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:33.010301    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:33.010310    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:33.010314    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:33.012927    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:33.509693    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:33.509712    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:33.509722    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:33.509726    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:33.512086    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.008568    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:34.008586    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:34.008594    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:34.008599    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:34.010823    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.509266    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:34.509365    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:34.509380    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:34.509386    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:34.512417    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.512850    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:35.009777    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:35.009792    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:35.009799    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:35.009802    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:35.011859    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:35.508525    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:35.508582    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:35.508590    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:35.508596    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:35.510810    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:36.009838    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:36.009864    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:36.009876    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:36.009881    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:36.012816    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:36.509201    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:36.509215    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:36.509265    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:36.509269    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:36.511244    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:37.010038    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:37.010064    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:37.010077    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:37.010083    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:37.013339    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:37.013728    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:37.509315    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:37.509330    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:37.509336    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:37.509339    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:37.511753    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:38.009336    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:38.009405    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:38.009415    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:38.009428    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:38.011725    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:38.508458    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:38.508483    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:38.508493    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:38.508500    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:38.511720    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:39.008429    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:39.008452    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:39.008459    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:39.008463    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:39.010408    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:39.508530    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:39.508555    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:39.508569    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:39.508577    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:39.511916    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:39.512435    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:40.009629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.009648    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.009663    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.009668    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.011742    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.509939    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.509963    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.509976    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.509982    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.512891    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.513173    2954 node_ready.go:49] node "ha-393000-m03" has status "Ready":"True"
	I0731 09:56:40.513182    2954 node_ready.go:38] duration metric: took 19.004877925s for node "ha-393000-m03" to be "Ready" ...
	I0731 09:56:40.513193    2954 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:56:40.513230    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:40.513235    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.513241    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.513244    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.517063    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:40.521698    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.521758    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 09:56:40.521763    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.521769    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.521773    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.524012    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.524507    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.524515    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.524521    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.524525    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.526095    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.526522    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.526532    2954 pod_ready.go:81] duration metric: took 4.820449ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.526539    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.526579    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 09:56:40.526584    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.526589    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.526597    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.528189    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.528737    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.528744    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.528750    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.528754    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.530442    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.530775    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.530784    2954 pod_ready.go:81] duration metric: took 4.239462ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.530790    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.530822    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 09:56:40.530827    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.530833    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.530840    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.532590    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.533050    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.533057    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.533062    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.533066    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.534760    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.535110    2954 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.535119    2954 pod_ready.go:81] duration metric: took 4.323936ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.535125    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.535164    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 09:56:40.535170    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.535175    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.535178    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.536947    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.537444    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:40.537451    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.537456    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.537460    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.539136    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.539571    2954 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.539580    2954 pod_ready.go:81] duration metric: took 4.45006ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.539587    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.710116    2954 request.go:629] Waited for 170.494917ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 09:56:40.710174    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 09:56:40.710180    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.710187    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.710190    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.712323    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.910582    2954 request.go:629] Waited for 197.870555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.910719    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.910732    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.910743    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.910750    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.913867    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:40.914265    2954 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.914278    2954 pod_ready.go:81] duration metric: took 374.68494ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.914293    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.110758    2954 request.go:629] Waited for 196.414025ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:56:41.110829    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:56:41.110835    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.110841    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.110844    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.112890    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:41.311962    2954 request.go:629] Waited for 198.609388ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:41.311995    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:41.312000    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.312006    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.312010    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.314041    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:41.314399    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:41.314410    2954 pod_ready.go:81] duration metric: took 400.109149ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.314418    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.511371    2954 request.go:629] Waited for 196.905615ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:56:41.511497    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:56:41.511508    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.511519    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.511526    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.514702    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:41.710099    2954 request.go:629] Waited for 194.801702ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:41.710131    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:41.710137    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.710143    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.710148    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.711902    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:41.712201    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:41.712211    2954 pod_ready.go:81] duration metric: took 397.788368ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.712225    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.910472    2954 request.go:629] Waited for 198.191914ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 09:56:41.910629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 09:56:41.910640    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.910651    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.910657    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.913895    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:42.111114    2954 request.go:629] Waited for 196.678487ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:42.111206    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:42.111214    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.111222    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.111228    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.113500    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.113867    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.113876    2954 pod_ready.go:81] duration metric: took 401.646528ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.113883    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.310054    2954 request.go:629] Waited for 196.129077ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:56:42.310144    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:56:42.310151    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.310157    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.310161    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.312081    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:42.510104    2954 request.go:629] Waited for 197.491787ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:42.510220    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:42.510230    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.510241    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.510249    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.512958    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.513508    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.513521    2954 pod_ready.go:81] duration metric: took 399.632057ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.513531    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.710421    2954 request.go:629] Waited for 196.851281ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:56:42.710510    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:56:42.710517    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.710523    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.710527    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.713018    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.910158    2954 request.go:629] Waited for 196.774024ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:42.910295    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:42.910307    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.910319    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.910327    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.913021    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.913406    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.913416    2954 pod_ready.go:81] duration metric: took 399.880068ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.913423    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.110445    2954 request.go:629] Waited for 196.965043ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 09:56:43.110548    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 09:56:43.110603    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.110615    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.110630    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.113588    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.311083    2954 request.go:629] Waited for 196.925492ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:43.311134    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:43.311139    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.311146    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.311149    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.313184    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.313462    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:43.313472    2954 pod_ready.go:81] duration metric: took 400.04465ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.313479    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.510584    2954 request.go:629] Waited for 197.060501ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:56:43.510710    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:56:43.510722    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.510731    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.510737    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.513575    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.710025    2954 request.go:629] Waited for 195.991998ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:43.710104    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:43.710111    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.710117    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.710121    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.712314    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.712653    2954 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:43.712663    2954 pod_ready.go:81] duration metric: took 399.178979ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.712670    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.910041    2954 request.go:629] Waited for 197.319656ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 09:56:43.910085    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 09:56:43.910092    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.910100    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.910108    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.913033    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.110409    2954 request.go:629] Waited for 196.775647ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:44.110512    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:44.110520    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.110526    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.110530    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.112726    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.113050    2954 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.113060    2954 pod_ready.go:81] duration metric: took 400.385455ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.113067    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.310143    2954 request.go:629] Waited for 197.043092ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:56:44.310236    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:56:44.310243    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.310253    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.310258    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.312471    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.510561    2954 request.go:629] Waited for 197.642859ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.510715    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.510728    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.510742    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.510750    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.513815    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:44.514349    2954 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.514363    2954 pod_ready.go:81] duration metric: took 401.290361ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.514372    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.711407    2954 request.go:629] Waited for 196.995177ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:56:44.711475    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:56:44.711482    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.711488    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.711491    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.713573    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.910056    2954 request.go:629] Waited for 196.042855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.910095    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.910103    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.910112    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.910117    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.912608    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.912924    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.912934    2954 pod_ready.go:81] duration metric: took 398.555138ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.912941    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.112001    2954 request.go:629] Waited for 199.012783ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:56:45.112114    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:56:45.112125    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.112136    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.112142    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.115328    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:45.310138    2954 request.go:629] Waited for 194.249421ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:45.310197    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:45.310207    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.310217    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.310226    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.315131    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:45.315432    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:45.315442    2954 pod_ready.go:81] duration metric: took 402.495485ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.315449    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.510510    2954 request.go:629] Waited for 195.017136ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 09:56:45.510595    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 09:56:45.510601    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.510607    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.510614    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.512663    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:45.709970    2954 request.go:629] Waited for 196.900157ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:45.710056    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:45.710063    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.710069    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.710073    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.712279    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:45.712540    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:45.712550    2954 pod_ready.go:81] duration metric: took 397.095893ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.712557    2954 pod_ready.go:38] duration metric: took 5.199358243s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:56:45.712568    2954 api_server.go:52] waiting for apiserver process to appear ...
	I0731 09:56:45.712620    2954 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:56:45.724210    2954 api_server.go:72] duration metric: took 24.575970869s to wait for apiserver process to appear ...
	I0731 09:56:45.724224    2954 api_server.go:88] waiting for apiserver healthz status ...
	I0731 09:56:45.724236    2954 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 09:56:45.729801    2954 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 09:56:45.729848    2954 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 09:56:45.729855    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.729862    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.729867    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.731097    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:45.731132    2954 api_server.go:141] control plane version: v1.30.3
	I0731 09:56:45.731141    2954 api_server.go:131] duration metric: took 6.912618ms to wait for apiserver health ...
	I0731 09:56:45.731147    2954 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 09:56:45.910423    2954 request.go:629] Waited for 179.236536ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:45.910520    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:45.910529    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.910537    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.910541    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.914926    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:45.919715    2954 system_pods.go:59] 24 kube-system pods found
	I0731 09:56:45.919728    2954 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:56:45.919732    2954 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:56:45.919735    2954 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:56:45.919738    2954 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:56:45.919742    2954 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 09:56:45.919745    2954 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:56:45.919748    2954 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:56:45.919750    2954 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 09:56:45.919753    2954 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:56:45.919756    2954 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:56:45.919759    2954 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 09:56:45.919761    2954 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:56:45.919764    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:56:45.919767    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 09:56:45.919770    2954 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:56:45.919773    2954 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 09:56:45.919776    2954 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:56:45.919778    2954 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:56:45.919780    2954 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:56:45.919783    2954 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 09:56:45.919785    2954 system_pods.go:61] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:56:45.919789    2954 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:56:45.919792    2954 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 09:56:45.919795    2954 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:56:45.919799    2954 system_pods.go:74] duration metric: took 188.647794ms to wait for pod list to return data ...
	I0731 09:56:45.919808    2954 default_sa.go:34] waiting for default service account to be created ...
	I0731 09:56:46.110503    2954 request.go:629] Waited for 190.648848ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:56:46.110629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:56:46.110641    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.110653    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.110659    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.113864    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:46.113948    2954 default_sa.go:45] found service account: "default"
	I0731 09:56:46.113959    2954 default_sa.go:55] duration metric: took 194.145984ms for default service account to be created ...
	I0731 09:56:46.113966    2954 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 09:56:46.310339    2954 request.go:629] Waited for 196.331355ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:46.310381    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:46.310387    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.310420    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.310424    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.314581    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:46.318894    2954 system_pods.go:86] 24 kube-system pods found
	I0731 09:56:46.318910    2954 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:56:46.318914    2954 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:56:46.318918    2954 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:56:46.318921    2954 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:56:46.318926    2954 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 09:56:46.318931    2954 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:56:46.318934    2954 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:56:46.318939    2954 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 09:56:46.318942    2954 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:56:46.318946    2954 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:56:46.318950    2954 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 09:56:46.318955    2954 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:56:46.318958    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:56:46.318963    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 09:56:46.318966    2954 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:56:46.318970    2954 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 09:56:46.318973    2954 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:56:46.318976    2954 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:56:46.318980    2954 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:56:46.318983    2954 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 09:56:46.318987    2954 system_pods.go:89] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:56:46.318990    2954 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:56:46.318993    2954 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 09:56:46.318996    2954 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:56:46.319002    2954 system_pods.go:126] duration metric: took 205.029246ms to wait for k8s-apps to be running ...
	I0731 09:56:46.319007    2954 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 09:56:46.319063    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:56:46.330197    2954 system_svc.go:56] duration metric: took 11.183343ms WaitForService to wait for kubelet
	I0731 09:56:46.330213    2954 kubeadm.go:582] duration metric: took 25.181975511s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:56:46.330225    2954 node_conditions.go:102] verifying NodePressure condition ...
	I0731 09:56:46.509976    2954 request.go:629] Waited for 179.711714ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 09:56:46.510033    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 09:56:46.510039    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.510045    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.510049    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.512677    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:46.513343    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513352    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513358    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513361    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513364    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513367    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513371    2954 node_conditions.go:105] duration metric: took 183.142994ms to run NodePressure ...
	I0731 09:56:46.513378    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:56:46.513392    2954 start.go:255] writing updated cluster config ...
	I0731 09:56:46.513784    2954 ssh_runner.go:195] Run: rm -f paused
	I0731 09:56:46.555311    2954 start.go:600] kubectl: 1.29.2, cluster: 1.30.3 (minor skew: 1)
	I0731 09:56:46.577040    2954 out.go:177] * Done! kubectl is now configured to use "ha-393000" cluster and "default" namespace by default
	
	
	==> Docker <==
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/25b3d6db405f49d365d6f33539e94ee4547921a7d0c463b94585056341530cda/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/c2a288a20831d0407ed1a2c3eeeb19a9758ef98813b916541258c8c58bcce38c/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/480020f5f9c0ce2e553e007beff5dfbe53b17bd2beaa73039be50701f04b9e76/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428712215Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428950502Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428960130Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.429078581Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477484798Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477564679Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477577219Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477869035Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507078466Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507147792Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507166914Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507244276Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853207982Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853706000Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853772518Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.854059851Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:56:47Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/e9ce137a2245c1333d3f3961469d32237e88656784f689211ed86cae2fd5518f/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Jul 31 16:56:49 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:56:49Z" level=info msg="Stop pulling image gcr.io/k8s-minikube/busybox:1.28: Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:1.28"
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157487366Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157549945Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157563641Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.158058722Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   About a minute ago   Running             busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         3 minutes ago        Running             coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         3 minutes ago        Running             coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	6d966e37d3618       6e38f40d628db                                                                                         3 minutes ago        Running             storage-provisioner       0                   25b3d6db405f4       storage-provisioner
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              4 minutes ago        Running             kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         4 minutes ago        Running             kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	e68314e525ef8       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     4 minutes ago        Running             kube-vip                  0                   c9f21d49b1384       kube-vip-ha-393000
	ab4f453cbe097       1f6d574d502f3                                                                                         4 minutes ago        Running             kube-apiserver            0                   7dc7f319faa98       kube-apiserver-ha-393000
	63e56744c84ee       3861cfcd7c04c                                                                                         4 minutes ago        Running             etcd                      0                   f8f20b1290499       etcd-ha-393000
	e19f7878939c9       76932a3b37d7e                                                                                         4 minutes ago        Running             kube-controller-manager   0                   67c995d2d2a3b       kube-controller-manager-ha-393000
	65412448c586b       3edc18e7b7672                                                                                         4 minutes ago        Running             kube-scheduler            0                   7ab9affa89eca       kube-scheduler-ha-393000
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:34336 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000091143s
	[INFO] 10.244.2.2:60404 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000085158s
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	
	
	==> coredns [feda36fb8a03] <==
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:43418 - 53237 "HINFO IN 5926041632293031093.721085148118182160. udp 56 false 512" NXDOMAIN qr,rd,ra 131 0.013101738s
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	
	
	==> describe nodes <==
	Name:               ha-393000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:53:48 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 16:58:08 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:54:24 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-393000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 baf02d554c20474b9fadb280fa1b8544
	  System UUID:                2cfe48dd-0000-0000-9b98-537ad9823a95
	  Boot ID:                    d6aa7e74-2f58-4a9d-a5df-37153dda8239
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-b94zr              0 (0%)        0 (0%)      0 (0%)           0 (0%)         89s
	  kube-system                 coredns-7db6d8ff4d-5m8st             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     4m11s
	  kube-system                 coredns-7db6d8ff4d-wvqjl             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     4m11s
	  kube-system                 etcd-ha-393000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         4m25s
	  kube-system                 kindnet-hjm7c                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      4m11s
	  kube-system                 kube-apiserver-ha-393000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         4m25s
	  kube-system                 kube-controller-manager-ha-393000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         4m26s
	  kube-system                 kube-proxy-zc52f                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m11s
	  kube-system                 kube-scheduler-ha-393000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         4m25s
	  kube-system                 kube-vip-ha-393000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m28s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m10s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age    From             Message
	  ----    ------                   ----   ----             -------
	  Normal  Starting                 4m9s   kube-proxy       
	  Normal  Starting                 4m25s  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  4m25s  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  4m25s  kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    4m25s  kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     4m25s  kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           4m12s  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  NodeReady                3m52s  kubelet          Node ha-393000 status is now: NodeReady
	  Normal  RegisteredNode           2m53s  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           101s   node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	
	
	Name:               ha-393000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:55:06 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 16:58:10 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:27 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-393000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 ef1036a76f3140bd891095c317498193
	  System UUID:                7863443c-0000-0000-8e8d-bbd47bc06547
	  Boot ID:                    d1d2508d-2745-4c36-9513-9d28d75304e0
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-zln22                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         89s
	  kube-system                 etcd-ha-393000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         3m8s
	  kube-system                 kindnet-lcwbs                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      3m10s
	  kube-system                 kube-apiserver-ha-393000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         3m8s
	  kube-system                 kube-controller-manager-ha-393000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         3m8s
	  kube-system                 kube-proxy-cf577                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m10s
	  kube-system                 kube-scheduler-ha-393000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         3m8s
	  kube-system                 kube-vip-ha-393000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m6s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 3m5s                   kube-proxy       
	  Normal  NodeHasSufficientMemory  3m10s (x8 over 3m10s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    3m10s (x8 over 3m10s)  kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     3m10s (x7 over 3m10s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  3m10s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           3m7s                   node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal  RegisteredNode           2m53s                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal  RegisteredNode           101s                   node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	
	
	Name:               ha-393000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:56:18 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 16:58:10 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:40 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-393000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 86f4bf9242d1461e9aec7b900dfd2277
	  System UUID:                451d42a6-0000-0000-8ccb-b8851dda0594
	  Boot ID:                    07f25a3c-b688-461e-9d49-0a60051d0c3c
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-n8d7h                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         89s
	  kube-system                 etcd-ha-393000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         116s
	  kube-system                 kindnet-s2pv6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      118s
	  kube-system                 kube-apiserver-ha-393000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         117s
	  kube-system                 kube-controller-manager-ha-393000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         116s
	  kube-system                 kube-proxy-cr9pg                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         118s
	  kube-system                 kube-scheduler-ha-393000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         117s
	  kube-system                 kube-vip-ha-393000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         114s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 114s                 kube-proxy       
	  Normal  NodeHasSufficientMemory  118s (x8 over 118s)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    118s (x8 over 118s)  kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     118s (x7 over 118s)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  118s                 kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           117s                 node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal  RegisteredNode           113s                 node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal  RegisteredNode           101s                 node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	
	
	==> dmesg <==
	[  +2.764750] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.236579] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000003] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.776173] systemd-fstab-generator[496]: Ignoring "noauto" option for root device
	[  +0.099418] systemd-fstab-generator[508]: Ignoring "noauto" option for root device
	[  +1.822617] systemd-fstab-generator[843]: Ignoring "noauto" option for root device
	[  +0.280031] systemd-fstab-generator[881]: Ignoring "noauto" option for root device
	[  +0.062769] kauditd_printk_skb: 95 callbacks suppressed
	[  +0.051458] systemd-fstab-generator[893]: Ignoring "noauto" option for root device
	[  +0.120058] systemd-fstab-generator[907]: Ignoring "noauto" option for root device
	[  +2.468123] systemd-fstab-generator[1123]: Ignoring "noauto" option for root device
	[  +0.099873] systemd-fstab-generator[1135]: Ignoring "noauto" option for root device
	[  +0.092257] systemd-fstab-generator[1147]: Ignoring "noauto" option for root device
	[  +0.106918] systemd-fstab-generator[1162]: Ignoring "noauto" option for root device
	[  +3.770701] systemd-fstab-generator[1268]: Ignoring "noauto" option for root device
	[  +0.056009] kauditd_printk_skb: 180 callbacks suppressed
	[  +2.552095] systemd-fstab-generator[1523]: Ignoring "noauto" option for root device
	[  +4.084188] systemd-fstab-generator[1702]: Ignoring "noauto" option for root device
	[  +0.054525] kauditd_printk_skb: 70 callbacks suppressed
	[  +7.033653] systemd-fstab-generator[2202]: Ignoring "noauto" option for root device
	[  +0.072815] kauditd_printk_skb: 72 callbacks suppressed
	[Jul31 16:54] kauditd_printk_skb: 12 callbacks suppressed
	[ +19.132251] kauditd_printk_skb: 38 callbacks suppressed
	[Jul31 16:55] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [63e56744c84e] <==
	{"level":"info","ts":"2024-07-31T16:55:08.065421Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-07-31T16:55:08.065437Z","caller":"etcdserver/server.go:1946","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T16:56:18.524077Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(2035864250365333051 13314548521573537860) learners=(14707668837576794450)"}
	{"level":"info","ts":"2024-07-31T16:56:18.525183Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844","added-peer-id":"cc1c22e219d8e152","added-peer-peer-urls":["https://192.169.0.7:2380"]}
	{"level":"info","ts":"2024-07-31T16:56:18.525227Z","caller":"rafthttp/peer.go:133","msg":"starting remote peer","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.525267Z","caller":"rafthttp/pipeline.go:72","msg":"started HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.525776Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.52608Z","caller":"rafthttp/peer.go:137","msg":"started remote peer","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.526181Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.526208Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:18.526232Z","caller":"rafthttp/transport.go:317","msg":"added remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-urls":["https://192.169.0.7:2380"]}
	{"level":"info","ts":"2024-07-31T16:56:18.526495Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"warn","ts":"2024-07-31T16:56:18.572765Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"cc1c22e219d8e152","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"warn","ts":"2024-07-31T16:56:19.066544Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"cc1c22e219d8e152","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-07-31T16:56:19.78495Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:19.785013Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:19.792429Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:19.81362Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"cc1c22e219d8e152","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-07-31T16:56:19.813712Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T16:56:19.850768Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"cc1c22e219d8e152","stream-type":"stream Message"}
	{"level":"info","ts":"2024-07-31T16:56:19.850881Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"warn","ts":"2024-07-31T16:56:20.066154Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"cc1c22e219d8e152","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-07-31T16:56:20.566995Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(2035864250365333051 13314548521573537860 14707668837576794450)"}
	{"level":"info","ts":"2024-07-31T16:56:20.567341Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-07-31T16:56:20.567501Z","caller":"etcdserver/server.go:1946","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"cc1c22e219d8e152"}
	
	
	==> kernel <==
	 16:58:16 up 4 min,  0 users,  load average: 0.70, 0.47, 0.20
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:57:30.116433       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:57:40.110429       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:57:40.110467       1 main.go:299] handling current node
	I0731 16:57:40.110480       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:57:40.110484       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:57:40.110718       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:57:40.110749       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:57:50.109884       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:57:50.109964       1 main.go:299] handling current node
	I0731 16:57:50.109980       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:57:50.110115       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:57:50.110404       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:57:50.110446       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:58:00.116121       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:58:00.116198       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:58:00.116281       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:58:00.116321       1 main.go:299] handling current node
	I0731 16:58:00.116341       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:58:00.116353       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:58:10.110132       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:58:10.110172       1 main.go:299] handling current node
	I0731 16:58:10.110185       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:58:10.110190       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:58:10.110340       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:58:10.110368       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [ab4f453cbe09] <==
	I0731 16:53:49.787246       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0731 16:53:49.838971       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0731 16:53:49.842649       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	I0731 16:53:49.843317       1 controller.go:615] quota admission added evaluator for: endpoints
	I0731 16:53:49.845885       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0731 16:53:50.451090       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0731 16:53:51.578858       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0731 16:53:51.587918       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0731 16:53:51.594571       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0731 16:54:05.505988       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	I0731 16:54:05.655031       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	E0731 16:56:52.014947       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51195: use of closed network connection
	E0731 16:56:52.206354       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51197: use of closed network connection
	E0731 16:56:52.403109       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51199: use of closed network connection
	E0731 16:56:52.600256       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51201: use of closed network connection
	E0731 16:56:52.785054       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51203: use of closed network connection
	E0731 16:56:53.004706       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51205: use of closed network connection
	E0731 16:56:53.208399       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51207: use of closed network connection
	E0731 16:56:53.392187       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51209: use of closed network connection
	E0731 16:56:53.714246       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51212: use of closed network connection
	E0731 16:56:53.895301       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51214: use of closed network connection
	E0731 16:56:54.078794       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51216: use of closed network connection
	E0731 16:56:54.262767       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51218: use of closed network connection
	E0731 16:56:54.448344       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51220: use of closed network connection
	E0731 16:56:54.629926       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51222: use of closed network connection
	
	
	==> kube-controller-manager [e19f7878939c] <==
	I0731 16:54:25.766270       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="26.902µs"
	I0731 16:54:29.808610       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0731 16:55:06.430472       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-393000-m02\" does not exist"
	I0731 16:55:06.448216       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-393000-m02" podCIDRs=["10.244.1.0/24"]
	I0731 16:55:09.814349       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-393000-m02"
	E0731 16:56:18.277948       1 certificate_controller.go:146] Sync csr-v42tm failed with : error updating signature for csr: Operation cannot be fulfilled on certificatesigningrequests.certificates.k8s.io "csr-v42tm": the object has been modified; please apply your changes to the latest version and try again
	I0731 16:56:18.384134       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-393000-m03\" does not exist"
	I0731 16:56:18.398095       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-393000-m03" podCIDRs=["10.244.2.0/24"]
	I0731 16:56:19.822872       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-393000-m03"
	I0731 16:56:47.522324       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="152.941157ms"
	I0731 16:56:47.574976       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="52.539469ms"
	I0731 16:56:47.678922       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="103.895055ms"
	I0731 16:56:47.701560       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="22.534098ms"
	I0731 16:56:47.701787       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="74.391µs"
	I0731 16:56:47.718186       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="8.079697ms"
	I0731 16:56:47.718269       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="39.867µs"
	I0731 16:56:47.744772       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="8.73015ms"
	I0731 16:56:47.745065       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="34.302µs"
	I0731 16:56:48.288860       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="38.605µs"
	I0731 16:56:49.532986       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.769402ms"
	I0731 16:56:49.533229       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="37.061µs"
	I0731 16:56:49.677499       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.411426ms"
	I0731 16:56:49.677560       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="21.894µs"
	I0731 16:56:51.343350       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="15.340858ms"
	I0731 16:56:51.343434       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="38.532µs"
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [65412448c586] <==
	W0731 16:53:48.491080       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0731 16:53:48.491132       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0731 16:53:48.491335       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:48.491387       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0731 16:53:48.491507       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0731 16:53:48.491594       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0731 16:53:48.491662       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:48.491738       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:48.491818       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:48.491860       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:48.491537       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:48.491873       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.319781       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0731 16:53:49.319838       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0731 16:53:49.326442       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.326478       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.392116       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:49.392172       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:49.496014       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.496036       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.541411       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:49.541927       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:49.588695       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:49.588735       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	I0731 16:53:49.982415       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 31 16:54:25 ha-393000 kubelet[2209]: I0731 16:54:25.725648    2209 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-5m8st" podStartSLOduration=20.725636637 podStartE2EDuration="20.725636637s" podCreationTimestamp="2024-07-31 16:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-31 16:54:25.724822579 +0000 UTC m=+33.396938994" watchObservedRunningTime="2024-07-31 16:54:25.725636637 +0000 UTC m=+33.397753046"
	Jul 31 16:54:25 ha-393000 kubelet[2209]: I0731 16:54:25.753514    2209 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/storage-provisioner" podStartSLOduration=19.753503033 podStartE2EDuration="19.753503033s" podCreationTimestamp="2024-07-31 16:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-31 16:54:25.752974741 +0000 UTC m=+33.425091155" watchObservedRunningTime="2024-07-31 16:54:25.753503033 +0000 UTC m=+33.425619443"
	Jul 31 16:54:52 ha-393000 kubelet[2209]: E0731 16:54:52.468990    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:54:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:54:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:54:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:54:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:55:52 ha-393000 kubelet[2209]: E0731 16:55:52.468170    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:55:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:55:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:55:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:55:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.510532    2209 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-wvqjl" podStartSLOduration=162.510247367 podStartE2EDuration="2m42.510247367s" podCreationTimestamp="2024-07-31 16:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-31 16:54:25.761498183 +0000 UTC m=+33.433614594" watchObservedRunningTime="2024-07-31 16:56:47.510247367 +0000 UTC m=+175.182363776"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.510944    2209 topology_manager.go:215] "Topology Admit Handler" podUID="dd382c29-63af-44cb-bf5b-b7db27f11017" podNamespace="default" podName="busybox-fc5497c4f-b94zr"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.640155    2209 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp8k4\" (UniqueName: \"kubernetes.io/projected/dd382c29-63af-44cb-bf5b-b7db27f11017-kube-api-access-cp8k4\") pod \"busybox-fc5497c4f-b94zr\" (UID: \"dd382c29-63af-44cb-bf5b-b7db27f11017\") " pod="default/busybox-fc5497c4f-b94zr"
	Jul 31 16:56:52 ha-393000 kubelet[2209]: E0731 16:56:52.472632    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:56:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:57:52 ha-393000 kubelet[2209]: E0731 16:57:52.468077    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:57:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:261: (dbg) Run:  kubectl --context ha-393000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:285: <<< TestMultiControlPlane/serial/CopyFile FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/CopyFile (3.28s)

                                                
                                    
TestMultiControlPlane/serial/StopSecondaryNode (11.78s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/StopSecondaryNode
ha_test.go:363: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 node stop m02 -v=7 --alsologtostderr
ha_test.go:363: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 node stop m02 -v=7 --alsologtostderr: (8.366107066s)
ha_test.go:369: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:369: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 7 (353.545494ms)

                                                
                                                
-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 09:58:25.963079    3166 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:58:25.963331    3166 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:58:25.963337    3166 out.go:304] Setting ErrFile to fd 2...
	I0731 09:58:25.963340    3166 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:58:25.963511    3166 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:58:25.963681    3166 out.go:298] Setting JSON to false
	I0731 09:58:25.963704    3166 mustload.go:65] Loading cluster: ha-393000
	I0731 09:58:25.963745    3166 notify.go:220] Checking for updates...
	I0731 09:58:25.964012    3166 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:58:25.964028    3166 status.go:255] checking status of ha-393000 ...
	I0731 09:58:25.964395    3166 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:25.964458    3166 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:25.973132    3166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51364
	I0731 09:58:25.973444    3166 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:25.973827    3166 main.go:141] libmachine: Using API Version  1
	I0731 09:58:25.973838    3166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:25.974059    3166 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:25.974186    3166 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:58:25.974281    3166 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:58:25.974340    3166 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:58:25.975319    3166 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 09:58:25.975339    3166 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:58:25.975596    3166 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:25.975616    3166 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:25.984086    3166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51366
	I0731 09:58:25.984425    3166 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:25.984756    3166 main.go:141] libmachine: Using API Version  1
	I0731 09:58:25.984777    3166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:25.984977    3166 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:25.985092    3166 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:58:25.985165    3166 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:58:25.985415    3166 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:25.985437    3166 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:25.994003    3166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51368
	I0731 09:58:25.994303    3166 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:25.994658    3166 main.go:141] libmachine: Using API Version  1
	I0731 09:58:25.994673    3166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:25.994862    3166 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:25.994955    3166 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:58:25.995097    3166 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:58:25.995114    3166 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:58:25.995193    3166 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:58:25.995267    3166 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:58:25.995352    3166 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:58:25.995434    3166 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:58:26.032140    3166 ssh_runner.go:195] Run: systemctl --version
	I0731 09:58:26.036385    3166 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:58:26.047794    3166 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:58:26.047818    3166 api_server.go:166] Checking apiserver status ...
	I0731 09:58:26.047852    3166 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:58:26.059780    3166 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:58:26.067013    3166 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:58:26.067059    3166 ssh_runner.go:195] Run: ls
	I0731 09:58:26.070016    3166 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:58:26.073068    3166 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:58:26.073078    3166 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 09:58:26.073088    3166 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:58:26.073101    3166 status.go:255] checking status of ha-393000-m02 ...
	I0731 09:58:26.073337    3166 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:26.073357    3166 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:26.081879    3166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51372
	I0731 09:58:26.082229    3166 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:26.082560    3166 main.go:141] libmachine: Using API Version  1
	I0731 09:58:26.082584    3166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:26.082790    3166 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:26.082909    3166 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:58:26.082991    3166 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:58:26.083068    3166 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:58:26.084059    3166 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 2980 missing from process table
	I0731 09:58:26.084076    3166 status.go:330] ha-393000-m02 host status = "Stopped" (err=<nil>)
	I0731 09:58:26.084084    3166 status.go:343] host is not running, skipping remaining checks
	I0731 09:58:26.084091    3166 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:58:26.084104    3166 status.go:255] checking status of ha-393000-m03 ...
	I0731 09:58:26.084374    3166 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:26.084407    3166 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:26.092803    3166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51374
	I0731 09:58:26.093137    3166 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:26.093458    3166 main.go:141] libmachine: Using API Version  1
	I0731 09:58:26.093469    3166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:26.093679    3166 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:26.093797    3166 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:58:26.093906    3166 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:58:26.093968    3166 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:58:26.094965    3166 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 09:58:26.094973    3166 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:58:26.095222    3166 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:26.095242    3166 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:26.103692    3166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51376
	I0731 09:58:26.104115    3166 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:26.104492    3166 main.go:141] libmachine: Using API Version  1
	I0731 09:58:26.104511    3166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:26.104730    3166 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:26.104844    3166 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:58:26.104935    3166 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:58:26.105192    3166 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:26.105220    3166 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:26.113664    3166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51378
	I0731 09:58:26.114004    3166 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:26.114346    3166 main.go:141] libmachine: Using API Version  1
	I0731 09:58:26.114375    3166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:26.114601    3166 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:26.114713    3166 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:58:26.114855    3166 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:58:26.114867    3166 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:58:26.114952    3166 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:58:26.115043    3166 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:58:26.115117    3166 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:58:26.115197    3166 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:58:26.151005    3166 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:58:26.162594    3166 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:58:26.162608    3166 api_server.go:166] Checking apiserver status ...
	I0731 09:58:26.162643    3166 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:58:26.174697    3166 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 09:58:26.183093    3166 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:58:26.183160    3166 ssh_runner.go:195] Run: ls
	I0731 09:58:26.186265    3166 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:58:26.189401    3166 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:58:26.189412    3166 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 09:58:26.189421    3166 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:58:26.189438    3166 status.go:255] checking status of ha-393000-m04 ...
	I0731 09:58:26.189697    3166 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:26.189716    3166 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:26.198148    3166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51382
	I0731 09:58:26.198459    3166 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:26.198799    3166 main.go:141] libmachine: Using API Version  1
	I0731 09:58:26.198815    3166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:26.199012    3166 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:26.199114    3166 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:58:26.199193    3166 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:58:26.199261    3166 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:58:26.200243    3166 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 09:58:26.200251    3166 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:58:26.200499    3166 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:26.200522    3166 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:26.208941    3166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51384
	I0731 09:58:26.209272    3166 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:26.209596    3166 main.go:141] libmachine: Using API Version  1
	I0731 09:58:26.209612    3166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:26.209789    3166 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:26.209899    3166 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:58:26.209975    3166 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:58:26.210219    3166 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:58:26.210239    3166 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:58:26.218735    3166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51386
	I0731 09:58:26.219071    3166 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:58:26.219422    3166 main.go:141] libmachine: Using API Version  1
	I0731 09:58:26.219440    3166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:58:26.219641    3166 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:58:26.219744    3166 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:58:26.219870    3166 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:58:26.219881    3166 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:58:26.219952    3166 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:58:26.220047    3166 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:58:26.220133    3166 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:58:26.220209    3166 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:58:26.251867    3166 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:58:26.262336    3166 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:381: status says not three kubelets are running: args "out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr": ha-393000
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured

ha-393000-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

ha-393000-m03
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured

ha-393000-m04
type: Worker
host: Running
kubelet: Stopped

helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:244: <<< TestMultiControlPlane/serial/StopSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/StopSecondaryNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
E0731 09:58:27.205581    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (2.45031017s)
helpers_test.go:252: TestMultiControlPlane/serial/StopSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |      Profile      |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| image   | functional-680000 image ls           | functional-680000 | jenkins | v1.33.1 | 31 Jul 24 09:52 PDT | 31 Jul 24 09:52 PDT |
	| delete  | -p functional-680000                 | functional-680000 | jenkins | v1.33.1 | 31 Jul 24 09:53 PDT | 31 Jul 24 09:53 PDT |
	| start   | -p ha-393000 --wait=true             | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:53 PDT | 31 Jul 24 09:56 PDT |
	|         | --memory=2200 --ha                   |                   |         |         |                     |                     |
	|         | -v=7 --alsologtostderr               |                   |         |         |                     |                     |
	|         | --driver=hyperkit                    |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- apply -f             | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | ./testdata/ha/ha-pod-dns-test.yaml   |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- rollout status       | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | deployment/busybox                   |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |                   |         |         |                     |                     |
	| node    | ha-393000 node stop m02 -v=7         | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:58 PDT |
	|         | --alsologtostderr                    |                   |         |         |                     |                     |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 09:53:16
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 09:53:16.140722    2954 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:53:16.140891    2954 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:53:16.140897    2954 out.go:304] Setting ErrFile to fd 2...
	I0731 09:53:16.140901    2954 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:53:16.141085    2954 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:53:16.142669    2954 out.go:298] Setting JSON to false
	I0731 09:53:16.166361    2954 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1366,"bootTime":1722443430,"procs":467,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 09:53:16.166460    2954 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 09:53:16.192371    2954 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 09:53:16.233499    2954 notify.go:220] Checking for updates...
	I0731 09:53:16.263444    2954 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 09:53:16.328756    2954 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:53:16.398694    2954 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 09:53:16.420465    2954 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 09:53:16.443406    2954 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:53:16.464565    2954 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 09:53:16.486871    2954 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 09:53:16.517461    2954 out.go:177] * Using the hyperkit driver based on user configuration
	I0731 09:53:16.559490    2954 start.go:297] selected driver: hyperkit
	I0731 09:53:16.559519    2954 start.go:901] validating driver "hyperkit" against <nil>
	I0731 09:53:16.559538    2954 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 09:53:16.563960    2954 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:53:16.564071    2954 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 09:53:16.572413    2954 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 09:53:16.576399    2954 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:53:16.576420    2954 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 09:53:16.576454    2954 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 09:53:16.576646    2954 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:53:16.576708    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:16.576719    2954 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0731 09:53:16.576725    2954 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0731 09:53:16.576791    2954 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docke
r CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0
GPUs: AutoPauseInterval:1m0s}
	I0731 09:53:16.576877    2954 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:53:16.619419    2954 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 09:53:16.640390    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:53:16.640480    2954 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 09:53:16.640509    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:53:16.640712    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:53:16.640731    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:53:16.641227    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:53:16.641275    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json: {Name:mka52f595799559e261228b691f11b60413ee780 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:16.641876    2954 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:53:16.641986    2954 start.go:364] duration metric: took 90.888µs to acquireMachinesLock for "ha-393000"
	I0731 09:53:16.642025    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType
:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:53:16.642108    2954 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 09:53:16.663233    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:53:16.663389    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:53:16.663426    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:53:16.672199    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51037
	I0731 09:53:16.672559    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:53:16.672976    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:53:16.672987    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:53:16.673241    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:53:16.673369    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:16.673473    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:16.673584    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:53:16.673605    2954 client.go:168] LocalClient.Create starting
	I0731 09:53:16.673642    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:53:16.673693    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:53:16.673710    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:53:16.673763    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:53:16.673801    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:53:16.673815    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:53:16.673840    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:53:16.673850    2954 main.go:141] libmachine: (ha-393000) Calling .PreCreateCheck
	I0731 09:53:16.673929    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:16.674073    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:16.684622    2954 main.go:141] libmachine: Creating machine...
	I0731 09:53:16.684647    2954 main.go:141] libmachine: (ha-393000) Calling .Create
	I0731 09:53:16.684806    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:16.685170    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.684943    2962 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:53:16.685305    2954 main.go:141] libmachine: (ha-393000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:53:16.866642    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.866533    2962 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa...
	I0731 09:53:16.907777    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.907707    2962 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk...
	I0731 09:53:16.907795    2954 main.go:141] libmachine: (ha-393000) DBG | Writing magic tar header
	I0731 09:53:16.907815    2954 main.go:141] libmachine: (ha-393000) DBG | Writing SSH key tar header
	I0731 09:53:16.908296    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.908249    2962 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000 ...
	I0731 09:53:17.278530    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:17.278549    2954 main.go:141] libmachine: (ha-393000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 09:53:17.278657    2954 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 09:53:17.388690    2954 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 09:53:17.388709    2954 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:53:17.388758    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:53:17.388793    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:53:17.388830    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=s
erial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:53:17.388871    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset
norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:53:17.388884    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:53:17.391787    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Pid is 2965
	I0731 09:53:17.392177    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 09:53:17.392188    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:17.392264    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:17.393257    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:17.393317    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:17.393342    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:17.393359    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:17.393369    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:17.399449    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:53:17.451566    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:53:17.452146    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:53:17.452168    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:53:17.452176    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:53:17.452184    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:53:17.832667    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:53:17.832680    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:53:17.947165    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:53:17.947181    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:53:17.947203    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:53:17.947214    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:53:17.948083    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:53:17.948094    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:53:19.393474    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 1
	I0731 09:53:19.393491    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:19.393544    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:19.394408    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:19.394431    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:19.394439    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:19.394449    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:19.394461    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:21.396273    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 2
	I0731 09:53:21.396290    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:21.396404    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:21.397210    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:21.397262    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:21.397275    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:21.397283    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:21.397292    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:23.397619    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 3
	I0731 09:53:23.397635    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:23.397733    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:23.398576    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:23.398585    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:23.398595    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:23.398604    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:23.398623    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:23.511265    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 09:53:23.511317    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 09:53:23.511327    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 09:53:23.534471    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 09:53:25.399722    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 4
	I0731 09:53:25.399735    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:25.399799    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:25.400596    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:25.400655    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:25.400665    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:25.400672    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:25.400681    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:27.400848    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 5
	I0731 09:53:27.400872    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:27.400976    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:27.401778    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:27.401824    2954 main.go:141] libmachine: (ha-393000) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:53:27.401836    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:53:27.401845    2954 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 09:53:27.401856    2954 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 09:53:27.401921    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:27.402530    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:27.402623    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:27.402706    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:53:27.402714    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:53:27.402795    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:27.402846    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:27.403621    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:53:27.403635    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:53:27.403641    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:53:27.403647    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:27.403727    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:27.403804    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:27.403889    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:27.403968    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:27.404083    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:27.404258    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:27.404265    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:53:28.471124    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:53:28.471139    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:53:28.471151    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.471303    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.471413    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.471516    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.471604    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.471751    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.471894    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.471902    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:53:28.534700    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:53:28.534755    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:53:28.534761    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:53:28.534766    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.534914    2954 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 09:53:28.534924    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.535023    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.535122    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.535205    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.535305    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.535404    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.535525    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.535678    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.535686    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 09:53:28.612223    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 09:53:28.612243    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.612383    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.612495    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.612585    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.612692    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.612835    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.612989    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.613000    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:53:28.684692    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:53:28.684711    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:53:28.684731    2954 buildroot.go:174] setting up certificates
	I0731 09:53:28.684742    2954 provision.go:84] configureAuth start
	I0731 09:53:28.684753    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.684892    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:28.684986    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.685097    2954 provision.go:143] copyHostCerts
	I0731 09:53:28.685132    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:53:28.685202    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:53:28.685210    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:53:28.685348    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:53:28.685544    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:53:28.685575    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:53:28.685580    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:53:28.685671    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:53:28.685817    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:53:28.685858    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:53:28.685863    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:53:28.685947    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:53:28.686099    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 09:53:28.975770    2954 provision.go:177] copyRemoteCerts
	I0731 09:53:28.975860    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:53:28.975879    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.976044    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.976151    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.976253    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.976368    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:29.014295    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:53:29.014364    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0731 09:53:29.033836    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:53:29.033901    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 09:53:29.053674    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:53:29.053744    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:53:29.073245    2954 provision.go:87] duration metric: took 388.494938ms to configureAuth
	I0731 09:53:29.073258    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:53:29.073388    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:53:29.073402    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:29.073538    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.073618    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.073712    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.073794    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.073871    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.073977    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.074114    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.074121    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:53:29.138646    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:53:29.138660    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:53:29.138727    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:53:29.138739    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.138887    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.138979    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.139070    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.139173    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.139333    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.139499    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.139544    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:53:29.214149    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:53:29.214180    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.214320    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.214403    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.214495    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.214599    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.214718    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.214856    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.214868    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:53:30.823417    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
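[Editor's note] The command above installs the generated unit with a diff-or-replace idiom: the new file is moved into place (and the daemon reloaded) only when it differs from the current one, or when no current file exists — which is why the first boot logs the `diff: can't stat` message before creating the symlink. A minimal standalone sketch of the same pattern (function name and paths are illustrative, not minikube's code):

```shell
# Diff-or-replace: install a freshly generated config file only when it
# differs from (or is missing at) the destination. Mirrors the
# `sudo diff -u old new || { sudo mv new old; ...; }` idiom in the log,
# minus the systemctl daemon-reload/enable/restart steps.
install_if_changed() {
  new="$1"; dest="$2"
  if ! diff -u "$dest" "$new" >/dev/null 2>&1; then
    mv "$new" "$dest"       # missing or different: replace
    echo "installed"
  else
    rm -f "$new"            # identical: discard the candidate
    echo "unchanged"
  fi
}
```

In the log the `||` branch additionally runs `systemctl -f daemon-reload`, `enable`, and `restart docker`, so the service is only bounced when the unit actually changed.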
	
	I0731 09:53:30.823433    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:53:30.823439    2954 main.go:141] libmachine: (ha-393000) Calling .GetURL
	I0731 09:53:30.823574    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:53:30.823582    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:53:30.823587    2954 client.go:171] duration metric: took 14.150104113s to LocalClient.Create
	I0731 09:53:30.823598    2954 start.go:167] duration metric: took 14.150148374s to libmachine.API.Create "ha-393000"
	I0731 09:53:30.823607    2954 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 09:53:30.823621    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:53:30.823633    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.823781    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:53:30.823793    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.823880    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.823974    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.824065    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.824160    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:30.868545    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:53:30.872572    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:53:30.872587    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:53:30.872696    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:53:30.872889    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:53:30.872896    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:53:30.873123    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:53:30.890087    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:53:30.911977    2954 start.go:296] duration metric: took 88.361428ms for postStartSetup
	I0731 09:53:30.912003    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:30.912600    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:30.912759    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:53:30.913103    2954 start.go:128] duration metric: took 14.271109881s to createHost
	I0731 09:53:30.913117    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.913201    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.913305    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.913399    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.913473    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.913588    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:30.913703    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:30.913711    2954 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 09:53:30.978737    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444810.120322538
	
	I0731 09:53:30.978750    2954 fix.go:216] guest clock: 1722444810.120322538
	I0731 09:53:30.978755    2954 fix.go:229] Guest: 2024-07-31 09:53:30.120322538 -0700 PDT Remote: 2024-07-31 09:53:30.913111 -0700 PDT m=+14.813015151 (delta=-792.788462ms)
	I0731 09:53:30.978778    2954 fix.go:200] guest clock delta is within tolerance: -792.788462ms
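[Editor's note] The guest-clock check above runs `date +%s.%N` over SSH and compares the result against the host's wall clock; a delta inside the tolerance (here −792 ms) means no clock correction is needed. A rough standalone sketch of that comparison — the 2-second threshold is illustrative, not minikube's actual tolerance:

```shell
# Compare guest and host epoch timestamps (seconds, possibly fractional);
# succeed when the absolute drift is within a tolerance in seconds.
# The default tolerance of 2s is a placeholder for illustration.
clock_within_tolerance() {
  guest="$1"; host="$2"; tol="${3:-2}"
  # awk handles the fractional arithmetic portably
  awk -v g="$guest" -v h="$host" -v t="$tol" \
    'BEGIN { d = g - h; if (d < 0) d = -d; exit (d <= t) ? 0 : 1 }'
}
```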
	I0731 09:53:30.978783    2954 start.go:83] releasing machines lock for "ha-393000", held for 14.336915594s
	I0731 09:53:30.978805    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.978937    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:30.979046    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979390    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979496    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979591    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:53:30.979625    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.979645    2954 ssh_runner.go:195] Run: cat /version.json
	I0731 09:53:30.979655    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.979750    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.979786    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.979846    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.979902    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.979927    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.979985    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.980003    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:30.980063    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:31.061693    2954 ssh_runner.go:195] Run: systemctl --version
	I0731 09:53:31.066472    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 09:53:31.070647    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:53:31.070687    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:53:31.084420    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:53:31.084432    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:53:31.084539    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:53:31.099368    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:53:31.108753    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:53:31.117896    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:53:31.117944    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:53:31.126974    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:53:31.135823    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:53:31.144673    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:53:31.153676    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:53:31.162890    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:53:31.171995    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:53:31.181357    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:53:31.190300    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:53:31.198317    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:53:31.206286    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:31.306658    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:53:31.325552    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:53:31.325643    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:53:31.346571    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:53:31.359753    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:53:31.393299    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:53:31.404448    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:53:31.414860    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:53:31.437636    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:53:31.448198    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:53:31.464071    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:53:31.467113    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:53:31.474646    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:53:31.488912    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:53:31.589512    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:53:31.693775    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:53:31.693845    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:53:31.709549    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:31.811094    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:53:34.149023    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.337932224s)
	I0731 09:53:34.149088    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:53:34.161198    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:53:34.175766    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:53:34.187797    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:53:34.283151    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:53:34.377189    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:34.469067    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:53:34.482248    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:53:34.492385    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:34.587912    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:53:34.647834    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:53:34.647904    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:53:34.652204    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:53:34.652250    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:53:34.655108    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:53:34.680326    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:53:34.680403    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:53:34.699387    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:53:34.764313    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:53:34.764369    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:34.764763    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:53:34.769523    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
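[Editor's note] The `/etc/hosts` update above is an idempotent add-or-replace: any existing line for `host.minikube.internal` is filtered out, the fresh mapping is appended, and the result is copied back into place. A standalone sketch of the same pattern (the function name and generic file parameter are illustrative; the log operates on `/etc/hosts` directly with `sudo cp`):

```shell
# Idempotently set a hosts-file entry: drop any existing line for the
# name, append the new IP/name mapping, replace the file atomically.
# Note: the name is used as a regex here, so dots match loosely — fine
# for a sketch, as in the grep -v idiom the log records.
set_host_entry() {
  file="$1"; ip="$2"; name="$3"
  tmp=$(mktemp)
  { grep -v "[[:space:]]$name\$" "$file" 2>/dev/null
    printf '%s\t%s\n' "$ip" "$name"; } > "$tmp"
  mv "$tmp" "$file"
}
```

Running it twice with different IPs leaves exactly one entry for the name, which is why minikube can re-run this step safely on every start.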
	I0731 09:53:34.780319    2954 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 09:53:34.780379    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:53:34.780438    2954 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 09:53:34.792271    2954 docker.go:685] Got preloaded images: 
	I0731 09:53:34.792283    2954 docker.go:691] registry.k8s.io/kube-apiserver:v1.30.3 wasn't preloaded
	I0731 09:53:34.792332    2954 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0731 09:53:34.800298    2954 ssh_runner.go:195] Run: which lz4
	I0731 09:53:34.803039    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0731 09:53:34.803157    2954 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0731 09:53:34.806121    2954 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0731 09:53:34.806135    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (359612007 bytes)
	I0731 09:53:35.858525    2954 docker.go:649] duration metric: took 1.055419334s to copy over tarball
	I0731 09:53:35.858591    2954 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0731 09:53:38.196952    2954 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.338365795s)
	I0731 09:53:38.196967    2954 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0731 09:53:38.223533    2954 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0731 09:53:38.232307    2954 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2630 bytes)
	I0731 09:53:38.245888    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:38.355987    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:53:40.705059    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.349073816s)
	I0731 09:53:40.705149    2954 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 09:53:40.718481    2954 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0731 09:53:40.718506    2954 cache_images.go:84] Images are preloaded, skipping loading
	I0731 09:53:40.718529    2954 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 09:53:40.718621    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:53:40.718689    2954 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 09:53:40.756905    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:40.756918    2954 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0731 09:53:40.756931    2954 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 09:53:40.756946    2954 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 09:53:40.757028    2954 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 09:53:40.757045    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:53:40.757094    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:53:40.770142    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:53:40.770212    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:53:40.770264    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:53:40.778467    2954 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 09:53:40.778510    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 09:53:40.786404    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 09:53:40.799629    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:53:40.814270    2954 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 09:53:40.827819    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1446 bytes)
	I0731 09:53:40.841352    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:53:40.844280    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:53:40.854288    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:40.961875    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:53:40.976988    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 09:53:40.977000    2954 certs.go:194] generating shared ca certs ...
	I0731 09:53:40.977011    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:40.977205    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:53:40.977278    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:53:40.977287    2954 certs.go:256] generating profile certs ...
	I0731 09:53:40.977331    2954 certs.go:363] generating signed profile cert for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:53:40.977344    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt with IP's: []
	I0731 09:53:41.064733    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt ...
	I0731 09:53:41.064749    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt: {Name:mk11f8b5ec16b878c9f692ccaff9a489ecc76fb2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.065074    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key ...
	I0731 09:53:41.065082    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key: {Name:mk18e6554cf3c807804faf77a7a9620e92860212 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.065322    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9
	I0731 09:53:41.065337    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.254]
	I0731 09:53:41.267360    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 ...
	I0731 09:53:41.267375    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9: {Name:mk9c13a9d071c94395118e1f00f992954683ef5b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.267745    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9 ...
	I0731 09:53:41.267755    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9: {Name:mk49f9f4ab2c1350a3cdb49ded7d6cffd5f069e5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.267965    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:53:41.268145    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:53:41.268307    2954 certs.go:363] generating signed profile cert for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:53:41.268320    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt with IP's: []
	I0731 09:53:41.352486    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt ...
	I0731 09:53:41.352499    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt: {Name:mk6759a3c690d7a9e990f65c338d22538c5b127a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.352775    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key ...
	I0731 09:53:41.352788    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key: {Name:mk4f661b46725a943b9862deb5f02f250855a1b5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.352992    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:53:41.353021    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:53:41.353040    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:53:41.353059    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:53:41.353078    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:53:41.353096    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:53:41.353115    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:53:41.353132    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:53:41.353229    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:53:41.353280    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:53:41.353289    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:53:41.353319    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:53:41.353348    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:53:41.353377    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:53:41.353444    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:53:41.353475    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.353494    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.353511    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.353950    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:53:41.373611    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:53:41.392573    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:53:41.412520    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:53:41.433349    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0731 09:53:41.452365    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0731 09:53:41.472032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:53:41.491092    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:53:41.510282    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:53:41.529242    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:53:41.549127    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:53:41.568112    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 09:53:41.581548    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:53:41.585729    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:53:41.594979    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.598924    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.598977    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.603300    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:53:41.612561    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:53:41.621665    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.624970    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.625005    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.629117    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 09:53:41.638283    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:53:41.647422    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.650741    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.650776    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.654995    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:53:41.664976    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:53:41.668030    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:53:41.668072    2954 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:53:41.668156    2954 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 09:53:41.680752    2954 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 09:53:41.691788    2954 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0731 09:53:41.701427    2954 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0731 09:53:41.710462    2954 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0731 09:53:41.710473    2954 kubeadm.go:157] found existing configuration files:
	
	I0731 09:53:41.710522    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0731 09:53:41.718051    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0731 09:53:41.718109    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0731 09:53:41.726696    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0731 09:53:41.737698    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0731 09:53:41.737751    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0731 09:53:41.745907    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0731 09:53:41.753641    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0731 09:53:41.753680    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0731 09:53:41.761450    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0731 09:53:41.769156    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0731 09:53:41.769207    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0731 09:53:41.777068    2954 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0731 09:53:41.848511    2954 kubeadm.go:310] [init] Using Kubernetes version: v1.30.3
	I0731 09:53:41.848564    2954 kubeadm.go:310] [preflight] Running pre-flight checks
	I0731 09:53:41.937481    2954 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0731 09:53:41.937568    2954 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0731 09:53:41.937658    2954 kubeadm.go:310] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I0731 09:53:42.093209    2954 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0731 09:53:42.137661    2954 out.go:204]   - Generating certificates and keys ...
	I0731 09:53:42.137715    2954 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0731 09:53:42.137758    2954 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0731 09:53:42.784132    2954 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0731 09:53:42.954915    2954 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0731 09:53:43.064099    2954 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0731 09:53:43.107145    2954 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0731 09:53:43.256550    2954 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0731 09:53:43.256643    2954 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-393000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0731 09:53:43.365808    2954 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0731 09:53:43.365910    2954 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-393000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0731 09:53:43.496987    2954 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0731 09:53:43.811530    2954 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0731 09:53:43.998883    2954 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0731 09:53:43.999156    2954 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0731 09:53:44.246352    2954 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0731 09:53:44.460463    2954 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0731 09:53:44.552236    2954 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0731 09:53:44.656335    2954 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0731 09:53:44.920852    2954 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0731 09:53:44.921188    2954 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0731 09:53:44.922677    2954 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0731 09:53:44.944393    2954 out.go:204]   - Booting up control plane ...
	I0731 09:53:44.944462    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0731 09:53:44.944530    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0731 09:53:44.944583    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0731 09:53:44.944663    2954 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0731 09:53:44.944728    2954 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0731 09:53:44.944759    2954 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0731 09:53:45.048317    2954 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0731 09:53:45.048393    2954 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0731 09:53:45.548165    2954 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 500.802272ms
	I0731 09:53:45.548224    2954 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0731 09:53:51.610602    2954 kubeadm.go:310] [api-check] The API server is healthy after 6.066816222s
	I0731 09:53:51.618854    2954 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0731 09:53:51.625868    2954 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0731 09:53:51.637830    2954 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0731 09:53:51.637998    2954 kubeadm.go:310] [mark-control-plane] Marking the node ha-393000 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0731 09:53:51.650953    2954 kubeadm.go:310] [bootstrap-token] Using token: wt4o9v.66pnb4w7anxpqs79
	I0731 09:53:51.687406    2954 out.go:204]   - Configuring RBAC rules ...
	I0731 09:53:51.687587    2954 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0731 09:53:51.690002    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0731 09:53:51.716618    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0731 09:53:51.718333    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0731 09:53:51.720211    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0731 09:53:51.722003    2954 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0731 09:53:52.016537    2954 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0731 09:53:52.431449    2954 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0731 09:53:53.015675    2954 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0731 09:53:53.016431    2954 kubeadm.go:310] 
	I0731 09:53:53.016524    2954 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0731 09:53:53.016539    2954 kubeadm.go:310] 
	I0731 09:53:53.016612    2954 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0731 09:53:53.016623    2954 kubeadm.go:310] 
	I0731 09:53:53.016649    2954 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0731 09:53:53.016721    2954 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0731 09:53:53.016763    2954 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0731 09:53:53.016773    2954 kubeadm.go:310] 
	I0731 09:53:53.016814    2954 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0731 09:53:53.016821    2954 kubeadm.go:310] 
	I0731 09:53:53.016868    2954 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0731 09:53:53.016891    2954 kubeadm.go:310] 
	I0731 09:53:53.016935    2954 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0731 09:53:53.017005    2954 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0731 09:53:53.017059    2954 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0731 09:53:53.017072    2954 kubeadm.go:310] 
	I0731 09:53:53.017139    2954 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0731 09:53:53.017203    2954 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0731 09:53:53.017207    2954 kubeadm.go:310] 
	I0731 09:53:53.017269    2954 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token wt4o9v.66pnb4w7anxpqs79 \
	I0731 09:53:53.017353    2954 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 \
	I0731 09:53:53.017373    2954 kubeadm.go:310] 	--control-plane 
	I0731 09:53:53.017381    2954 kubeadm.go:310] 
	I0731 09:53:53.017452    2954 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0731 09:53:53.017461    2954 kubeadm.go:310] 
	I0731 09:53:53.017528    2954 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token wt4o9v.66pnb4w7anxpqs79 \
	I0731 09:53:53.017610    2954 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 
	I0731 09:53:53.018224    2954 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0731 09:53:53.018239    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:53.018245    2954 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0731 09:53:53.040097    2954 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0731 09:53:53.097376    2954 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0731 09:53:53.101992    2954 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.3/kubectl ...
	I0731 09:53:53.102004    2954 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0731 09:53:53.115926    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0731 09:53:53.335699    2954 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0731 09:53:53.335768    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:53.335769    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000 minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=true
	I0731 09:53:53.489955    2954 ops.go:34] apiserver oom_adj: -16
	I0731 09:53:53.490022    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:53.990085    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:54.490335    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:54.991422    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:55.490608    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:55.990200    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:56.490175    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:56.990807    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:57.491373    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:57.991164    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:58.491587    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:58.990197    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:59.490119    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:59.990444    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:00.490776    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:00.990123    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:01.490685    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:01.991905    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:02.490505    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:02.990148    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:03.490590    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:03.990745    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:04.491071    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:04.991117    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:05.490027    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:05.576301    2954 kubeadm.go:1113] duration metric: took 12.240698872s to wait for elevateKubeSystemPrivileges
	I0731 09:54:05.576324    2954 kubeadm.go:394] duration metric: took 23.908471214s to StartCluster
	I0731 09:54:05.576346    2954 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:05.576441    2954 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:54:05.576993    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:05.577274    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0731 09:54:05.577286    2954 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:05.577302    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:54:05.577319    2954 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 09:54:05.577357    2954 addons.go:69] Setting storage-provisioner=true in profile "ha-393000"
	I0731 09:54:05.577363    2954 addons.go:69] Setting default-storageclass=true in profile "ha-393000"
	I0731 09:54:05.577386    2954 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-393000"
	I0731 09:54:05.577386    2954 addons.go:234] Setting addon storage-provisioner=true in "ha-393000"
	I0731 09:54:05.577408    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:05.577423    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:05.577661    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.577669    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.577675    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.577679    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.587150    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51060
	I0731 09:54:05.587233    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51061
	I0731 09:54:05.587573    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.587584    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.587918    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.587919    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.587930    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.587931    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.588210    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.588232    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.588358    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.588454    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.588531    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.588614    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.588639    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.590714    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:54:05.590994    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 09:54:05.591385    2954 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 09:54:05.591537    2954 addons.go:234] Setting addon default-storageclass=true in "ha-393000"
	I0731 09:54:05.591560    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:05.591783    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.591798    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.597469    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51064
	I0731 09:54:05.597830    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.598161    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.598171    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.598405    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.598520    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.598612    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.598688    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.599681    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:05.600339    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51066
	I0731 09:54:05.600677    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.601035    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.601051    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.601254    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.601611    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.601637    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.610207    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51068
	I0731 09:54:05.610548    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.610892    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.610909    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.611149    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.611266    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.611351    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.611421    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.612421    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:05.612552    2954 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0731 09:54:05.612560    2954 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0731 09:54:05.612568    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:05.612695    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:05.612786    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:05.612891    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:05.612974    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:05.623428    2954 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0731 09:54:05.644440    2954 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0731 09:54:05.644452    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0731 09:54:05.644468    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:05.644630    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:05.644723    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:05.644822    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:05.644921    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:05.653382    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.169.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0731 09:54:05.687318    2954 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0731 09:54:05.764200    2954 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0731 09:54:06.182319    2954 start.go:971] {"host.minikube.internal": 192.169.0.1} host record injected into CoreDNS's ConfigMap
	I0731 09:54:06.182364    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.182377    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.182560    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.182561    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.182572    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.182582    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.182588    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.182708    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.182715    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.182734    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.182830    2954 round_trippers.go:463] GET https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0731 09:54:06.182842    2954 round_trippers.go:469] Request Headers:
	I0731 09:54:06.182849    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:54:06.182854    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:54:06.189976    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:54:06.190422    2954 round_trippers.go:463] PUT https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0731 09:54:06.190430    2954 round_trippers.go:469] Request Headers:
	I0731 09:54:06.190435    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:54:06.190439    2954 round_trippers.go:473]     Content-Type: application/json
	I0731 09:54:06.190441    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:54:06.192143    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:54:06.192277    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.192285    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.192466    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.192478    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.192482    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318368    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.318380    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.318552    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.318557    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318564    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.318573    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.318591    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.318752    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318752    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.318769    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.354999    2954 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
	I0731 09:54:06.412621    2954 addons.go:510] duration metric: took 835.314471ms for enable addons: enabled=[default-storageclass storage-provisioner]
	I0731 09:54:06.412653    2954 start.go:246] waiting for cluster config update ...
	I0731 09:54:06.412665    2954 start.go:255] writing updated cluster config ...
	I0731 09:54:06.449784    2954 out.go:177] 
	I0731 09:54:06.487284    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:06.487391    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:06.509688    2954 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 09:54:06.585678    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:54:06.585712    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:54:06.585911    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:54:06.585931    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:54:06.586023    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:06.586742    2954 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:54:06.586867    2954 start.go:364] duration metric: took 101.68µs to acquireMachinesLock for "ha-393000-m02"
	I0731 09:54:06.586897    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks
:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:06.586986    2954 start.go:125] createHost starting for "m02" (driver="hyperkit")
	I0731 09:54:06.608709    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:54:06.608788    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:06.608805    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:06.617299    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51073
	I0731 09:54:06.617638    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:06.618011    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:06.618029    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:06.618237    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:06.618326    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:06.618405    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:06.618514    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:54:06.618528    2954 client.go:168] LocalClient.Create starting
	I0731 09:54:06.618559    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:54:06.618609    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:54:06.618620    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:54:06.618668    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:54:06.618707    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:54:06.618717    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:54:06.618731    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:54:06.618737    2954 main.go:141] libmachine: (ha-393000-m02) Calling .PreCreateCheck
	I0731 09:54:06.618808    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:06.618841    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:06.646223    2954 main.go:141] libmachine: Creating machine...
	I0731 09:54:06.646236    2954 main.go:141] libmachine: (ha-393000-m02) Calling .Create
	I0731 09:54:06.646361    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:06.646520    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.646351    2979 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:54:06.646597    2954 main.go:141] libmachine: (ha-393000-m02) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:54:06.831715    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.831641    2979 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa...
	I0731 09:54:06.939142    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.939044    2979 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk...
	I0731 09:54:06.939162    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Writing magic tar header
	I0731 09:54:06.939170    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Writing SSH key tar header
	I0731 09:54:06.940042    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.939949    2979 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02 ...
	I0731 09:54:07.311809    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:07.311824    2954 main.go:141] libmachine: (ha-393000-m02) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 09:54:07.311866    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 09:54:07.337818    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 09:54:07.337835    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:54:07.337884    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:54:07.337912    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:54:07.337954    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:54:07.337986    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:54:07.338000    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:54:07.340860    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Pid is 2980
	I0731 09:54:07.341360    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 09:54:07.341374    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:07.341426    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:07.342343    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:07.342405    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:07.342418    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:07.342433    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:07.342443    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:07.342451    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:07.348297    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:54:07.357913    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:54:07.358688    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:54:07.358712    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:54:07.358723    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:54:07.358740    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:54:07.743017    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:54:07.743035    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:54:07.858034    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:54:07.858062    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:54:07.858072    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:54:07.858084    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:54:07.858884    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:54:07.858896    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:54:09.343775    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 1
	I0731 09:54:09.343792    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:09.343900    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:09.344720    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:09.344781    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:09.344792    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:09.344804    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:09.344817    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:09.344826    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:11.346829    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 2
	I0731 09:54:11.346846    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:11.346940    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:11.347752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:11.347766    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:11.347784    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:11.347795    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:11.347819    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:11.347832    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:13.348981    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 3
	I0731 09:54:13.349001    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:13.349109    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:13.349907    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:13.349943    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:13.349954    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:13.349965    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:13.349972    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:13.349980    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:13.459282    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 09:54:13.459342    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 09:54:13.459355    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 09:54:13.483197    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 09:54:15.351752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 4
	I0731 09:54:15.351769    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:15.351820    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:15.352675    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:15.352721    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:15.352735    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:15.352744    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:15.352752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:15.352760    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:17.353423    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 5
	I0731 09:54:17.353439    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:17.353530    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:17.354334    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:17.354363    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:54:17.354369    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:54:17.354392    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 09:54:17.354398    2954 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
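The retry loop above (Attempts 0–5) resolves the new VM's IP by rescanning `/var/db/dhcpd_leases` until an entry with the VM's MAC address (`d6:c5:55:d7:1e:6a`) appears. A minimal sketch of that lookup, assuming the `{Name:... IPAddress:... HWAddress:...}` entry format shown in the log; `findIPByMAC` is an illustrative helper, not minikube's actual code:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// findIPByMAC scans dhcpd_leases-style entries (as printed in the log
// above) and returns the IPAddress whose HWAddress matches the MAC.
func findIPByMAC(leases, mac string) (string, bool) {
	re := regexp.MustCompile(`IPAddress:(\S+) HWAddress:(\S+) `)
	for _, line := range strings.Split(leases, "\n") {
		if m := re.FindStringSubmatch(line); m != nil && m[2] == mac {
			return m[1], true
		}
	}
	return "", false // not leased yet; caller sleeps and retries
}

func main() {
	leases := "{Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}\n" +
		"{Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}"
	ip, ok := findIPByMAC(leases, "d6:c5:55:d7:1e:6a")
	fmt.Println(ip, ok) // 192.169.0.6 true
}
```

Until the lease shows up (Attempts 0–3 above see only 4 entries), the lookup fails and the driver sleeps roughly two seconds before rescanning.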
	I0731 09:54:17.354469    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:17.355226    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:17.355356    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:17.355457    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:54:17.355466    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:54:17.355564    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:17.355626    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:17.356407    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:54:17.356415    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:54:17.356426    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:54:17.356432    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:17.356529    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:17.356628    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:17.356727    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:17.356823    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:17.356939    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:17.357111    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:17.357118    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:54:18.376907    2954 main.go:141] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I0731 09:54:21.440008    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:54:21.440021    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:54:21.440026    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.440157    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.440265    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.440360    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.440445    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.440567    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.440720    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.440728    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:54:21.502840    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:54:21.502894    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:54:21.502900    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:54:21.502905    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.503041    2954 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 09:54:21.503052    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.503150    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.503242    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.503322    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.503392    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.503473    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.503584    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.503728    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.503737    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 09:54:21.579730    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 09:54:21.579745    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.579874    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.579976    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.580070    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.580163    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.580287    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.580427    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.580439    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:54:21.651021    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:54:21.651038    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:54:21.651048    2954 buildroot.go:174] setting up certificates
	I0731 09:54:21.651054    2954 provision.go:84] configureAuth start
	I0731 09:54:21.651061    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.651192    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:21.651290    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.651382    2954 provision.go:143] copyHostCerts
	I0731 09:54:21.651408    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:54:21.651454    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:54:21.651459    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:54:21.651611    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:54:21.651812    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:54:21.651848    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:54:21.651853    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:54:21.651933    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:54:21.652069    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:54:21.652109    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:54:21.652114    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:54:21.652196    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:54:21.652337    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 09:54:21.695144    2954 provision.go:177] copyRemoteCerts
	I0731 09:54:21.695204    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:54:21.695225    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.695363    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.695457    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.695544    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.695616    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:21.734262    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:54:21.734338    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 09:54:21.760893    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:54:21.760979    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 09:54:21.787062    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:54:21.787131    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:54:21.807971    2954 provision.go:87] duration metric: took 156.910143ms to configureAuth
	I0731 09:54:21.807985    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:54:21.808123    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:21.808137    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:21.808270    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.808350    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.808427    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.808504    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.808592    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.808693    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.808822    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.808830    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:54:21.871923    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:54:21.871936    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:54:21.872014    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:54:21.872025    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.872159    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.872242    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.872339    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.872432    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.872558    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.872693    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.872741    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:54:21.947253    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:54:21.947272    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.947434    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.947533    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.947607    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.947689    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.947845    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.947990    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.948005    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:54:23.521299    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
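The `sudo diff -u old new || { sudo mv new old; ... restart; }` command above is an idempotent-update idiom: the unit file is replaced and the daemon restarted only when the rendered file actually differs (or, as here, does not exist yet). A minimal, self-contained sketch of the same pattern, using temp files in place of the real `/lib/systemd/system/docker.service` paths:

```shell
#!/bin/sh
# Idempotent config replacement: deploy the new file and signal a restart
# only when it differs from what is currently installed. Paths and the
# echoed "restart" step are stand-ins for the real systemd unit handling.
set -eu

current=$(mktemp) && new=$(mktemp)
printf 'ExecStart=/usr/bin/dockerd --old-flag\n' > "$current"
printf 'ExecStart=/usr/bin/dockerd --new-flag\n' > "$new"

# diff exits 0 when the files are identical, non-zero when they differ
if diff -u "$current" "$new" >/dev/null; then
    echo "unchanged: skipping restart"
else
    mv "$new" "$current"   # deploy the new unit file
    echo "changed: would daemon-reload and restart docker"
fi

cat "$current"   # now carries the new flag
rm -f "$current" "$new"
```

Running the check a second time against an unchanged file takes the "unchanged" branch, which is what makes repeated provisioning runs safe.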
	
	I0731 09:54:23.521320    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:54:23.521327    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetURL
	I0731 09:54:23.521467    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:54:23.521475    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:54:23.521480    2954 client.go:171] duration metric: took 16.903099578s to LocalClient.Create
	I0731 09:54:23.521492    2954 start.go:167] duration metric: took 16.903132869s to libmachine.API.Create "ha-393000"
	I0731 09:54:23.521498    2954 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 09:54:23.521504    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:54:23.521519    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.521663    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:54:23.521677    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.521769    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.521859    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.521933    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.522032    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:23.560604    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:54:23.563782    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:54:23.563793    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:54:23.563892    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:54:23.564080    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:54:23.564086    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:54:23.564293    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:54:23.571517    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:54:23.591429    2954 start.go:296] duration metric: took 69.922656ms for postStartSetup
	I0731 09:54:23.591460    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:23.592068    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:23.592212    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:23.592596    2954 start.go:128] duration metric: took 17.005735325s to createHost
	I0731 09:54:23.592609    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.592713    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.592826    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.592928    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.593022    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.593148    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:23.593279    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:23.593287    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:54:23.656618    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444863.810880618
	
	I0731 09:54:23.656630    2954 fix.go:216] guest clock: 1722444863.810880618
	I0731 09:54:23.656635    2954 fix.go:229] Guest: 2024-07-31 09:54:23.810880618 -0700 PDT Remote: 2024-07-31 09:54:23.592602 -0700 PDT m=+67.492982270 (delta=218.278618ms)
	I0731 09:54:23.656654    2954 fix.go:200] guest clock delta is within tolerance: 218.278618ms
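The guest-clock check above compares the VM's `date` output against the host clock and accepts the ~218ms delta as within tolerance. A rough sketch of that drift check, run locally so both timestamps come from the same machine; the 2-second threshold is an illustrative value, not minikube's actual tolerance:

```shell
#!/bin/sh
# Clock-drift check sketch: compare a "guest" timestamp to the local
# clock and flag drift beyond a threshold. In the real flow the guest
# epoch arrives over SSH (`date` inside the VM).
set -eu

guest_epoch=$(date +%s)   # stand-in for the remote reading
host_epoch=$(date +%s)

delta=$((host_epoch - guest_epoch))
[ "$delta" -lt 0 ] && delta=$((-delta))   # absolute value

if [ "$delta" -le 2 ]; then
    echo "clock delta ${delta}s within tolerance"
else
    echo "clock delta ${delta}s too large: would resync guest clock"
fi
```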
	I0731 09:54:23.656663    2954 start.go:83] releasing machines lock for "ha-393000-m02", held for 17.069938552s
	I0731 09:54:23.656681    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.656811    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:23.684522    2954 out.go:177] * Found network options:
	I0731 09:54:23.836571    2954 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 09:54:23.866932    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:54:23.866975    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.867861    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.868089    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.868209    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:54:23.868288    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 09:54:23.868332    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:54:23.868439    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 09:54:23.868462    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.868525    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.868708    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.868756    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.868922    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.868944    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.869058    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:23.869081    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.869206    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 09:54:23.904135    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:54:23.904205    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:54:23.927324    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:54:23.927338    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:54:23.927400    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:54:23.970222    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:54:23.978777    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:54:23.987481    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:54:23.987533    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:54:23.996430    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:54:24.004692    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:54:24.012968    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:54:24.021204    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:54:24.030482    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:54:24.038802    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:54:24.047006    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:54:24.055781    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:54:24.063050    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:54:24.072089    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:24.169406    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
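The run of `sed -i -r` commands above rewrites `/etc/containerd/config.toml` in place, most notably flipping `SystemdCgroup` to `false` to select the cgroupfs driver. A small sketch of that edit against a scratch copy of the config fragment (GNU sed assumed, as on the guest):

```shell
#!/bin/sh
# Sketch of the sed-based containerd reconfiguration: flip SystemdCgroup
# in a config.toml fragment, preserving the line's indentation via the
# captured group, exactly as the logged command does.
set -eu

cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF

sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"

grep 'SystemdCgroup' "$cfg"   # prints the rewritten line, now = false
rm -f "$cfg"
```

The `( *)` capture is what keeps the TOML indentation intact, so the edit is safe to apply repeatedly.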
	I0731 09:54:24.189452    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:54:24.189519    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:54:24.202393    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:54:24.214821    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:54:24.229583    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:54:24.240171    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:54:24.250428    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:54:24.302946    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:54:24.313120    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:54:24.327912    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:54:24.331673    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:54:24.338902    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:54:24.352339    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:54:24.449032    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:54:24.557842    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:54:24.557870    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:54:24.571700    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:24.673137    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:54:27.047079    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.373944592s)
	I0731 09:54:27.047137    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:54:27.057410    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:54:27.071816    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:54:27.082278    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:54:27.176448    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:54:27.277016    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:27.384870    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:54:27.398860    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:54:27.409735    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:27.507837    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:54:27.568313    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:54:27.568381    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:54:27.573262    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:54:27.573320    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:54:27.579109    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:54:27.606116    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:54:27.606208    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:54:27.625621    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:54:27.663443    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:54:27.704938    2954 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 09:54:27.726212    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:27.726560    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:54:27.730336    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
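The `{ grep -v ...; echo ...; } > /tmp/h.$$; sudo cp` command above is another idempotent idiom: strip any existing `host.minikube.internal` line, append the current mapping, and replace the file via a temp copy. A sketch against a scratch file instead of the real `/etc/hosts` (the pattern is simplified; the logged command anchors on a literal tab):

```shell
#!/bin/sh
# Idempotent hosts-entry update: remove stale mappings for the name,
# append the fresh one, and swap the file in via a temp copy.
set -eu

hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n10.0.0.9\thost.minikube.internal\n' > "$hosts"

tmp=$(mktemp)
{ grep -v 'host\.minikube\.internal$' "$hosts"; \
  printf '192.169.0.1\thost.minikube.internal\n'; } > "$tmp"
cp "$tmp" "$hosts"

grep 'host.minikube.internal' "$hosts"   # only the fresh 192.169.0.1 entry remains
rm -f "$hosts" "$tmp"
```

Because the old entry is filtered before the new one is appended, repeated runs never accumulate duplicate lines.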
	I0731 09:54:27.740553    2954 mustload.go:65] Loading cluster: ha-393000
	I0731 09:54:27.740700    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:27.740921    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:27.740943    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:27.749667    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51097
	I0731 09:54:27.750028    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:27.750384    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:27.750401    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:27.750596    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:27.750732    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:27.750813    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:27.750888    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:27.751853    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:27.752094    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:27.752117    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:27.760565    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51099
	I0731 09:54:27.760882    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:27.761210    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:27.761223    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:27.761435    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:27.761551    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:27.761648    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.6
	I0731 09:54:27.761653    2954 certs.go:194] generating shared ca certs ...
	I0731 09:54:27.761672    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.761836    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:54:27.761936    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:54:27.761945    2954 certs.go:256] generating profile certs ...
	I0731 09:54:27.762034    2954 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:54:27.762058    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069
	I0731 09:54:27.762073    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0731 09:54:27.834156    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 ...
	I0731 09:54:27.834169    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069: {Name:mk0062f228b9fa8374eba60d674a49cb0265b988 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.834495    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069 ...
	I0731 09:54:27.834504    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069: {Name:mkd62a5cca652a59908630fd95f20d2e01386237 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.834713    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:54:27.834929    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:54:27.835197    2954 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:54:27.835206    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:54:27.835229    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:54:27.835247    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:54:27.835267    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:54:27.835284    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:54:27.835302    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:54:27.835321    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:54:27.835338    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:54:27.835425    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:54:27.835473    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:54:27.835481    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:54:27.835511    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:54:27.835539    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:54:27.835575    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:54:27.835647    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:54:27.835682    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:54:27.835703    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:54:27.835723    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:27.835762    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:27.835910    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:27.836005    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:27.836102    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:27.836203    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:27.868754    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0731 09:54:27.872390    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 09:54:27.881305    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0731 09:54:27.884697    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 09:54:27.893772    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 09:54:27.896980    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 09:54:27.905593    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0731 09:54:27.908812    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 09:54:27.916605    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0731 09:54:27.919921    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 09:54:27.927985    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0731 09:54:27.931223    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 09:54:27.940238    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:54:27.960044    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:54:27.980032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:54:27.999204    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:54:28.018549    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0731 09:54:28.037848    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 09:54:28.057376    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:54:28.076776    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:54:28.096215    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:54:28.115885    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:54:28.135490    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:54:28.154907    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 09:54:28.169275    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 09:54:28.183001    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 09:54:28.196610    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 09:54:28.210320    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 09:54:28.223811    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 09:54:28.237999    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 09:54:28.251767    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:54:28.256201    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:54:28.265361    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.268834    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.268882    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.273194    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:54:28.282819    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:54:28.292122    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.295585    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.295622    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.299894    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:54:28.308965    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:54:28.318848    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.322347    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.322383    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.326657    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
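The `openssl x509 -hash` / `ln -fs ... <hash>.0` pairs above wire certificates into the OpenSSL CA directory layout, where CAs are looked up by subject-hash filenames. A self-contained sketch using a throwaway self-signed cert in a temp dir rather than the real `/etc/ssl/certs`:

```shell
#!/bin/sh
# CA-trust wiring sketch: compute a certificate's subject hash and create
# the <hash>.0 symlink OpenSSL uses for CA directory lookups.
set -eu

dir=$(mktemp -d)
# Throwaway self-signed cert standing in for the real CA files
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
    -subj "/CN=example-test-ca" \
    -keyout "$dir/ca.key" -out "$dir/ca.pem" 2>/dev/null

hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")
ln -fs "$dir/ca.pem" "$dir/$hash.0"

ls -l "$dir/$hash.0"   # hash-named symlink pointing at the cert
rm -rf "$dir"
```

The `.0` suffix disambiguates distinct certificates that happen to share a subject hash (`.1`, `.2`, and so on), which is why the logged commands test for the link before creating it.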
	I0731 09:54:28.335765    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:54:28.338885    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:54:28.338923    2954 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.30.3 docker true true} ...
	I0731 09:54:28.338981    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:54:28.338998    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:54:28.339031    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:54:28.352962    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:54:28.353010    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:54:28.353068    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:54:28.361447    2954 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 09:54:28.361501    2954 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 09:54:28.370031    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet
	I0731 09:54:28.370031    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm
	I0731 09:54:28.370036    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl
	I0731 09:54:31.406224    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:54:31.406308    2954 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:54:31.409804    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 09:54:31.409825    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 09:54:32.215163    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:54:32.215265    2954 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:54:32.218832    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 09:54:32.218858    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 09:54:39.678084    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:54:39.690174    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:54:39.690295    2954 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:54:39.693595    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 09:54:39.693614    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 09:54:39.964594    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 09:54:39.972786    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 09:54:39.986436    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:54:39.999856    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 09:54:40.013590    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:54:40.016608    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:54:40.026617    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:40.125738    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:54:40.142197    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:40.142482    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:40.142512    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:40.151352    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51126
	I0731 09:54:40.151710    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:40.152074    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:40.152091    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:40.152318    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:40.152428    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:40.152528    2954 start.go:317] joinCluster: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 Clu
sterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpira
tion:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:54:40.152603    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0731 09:54:40.152616    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:40.152722    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:40.152805    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:40.152933    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:40.153036    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:40.232831    2954 start.go:343] trying to join control-plane node "m02" to cluster: &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:40.232861    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token heh6bo.7n85cftszx0hevpy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443"
	I0731 09:55:07.963279    2954 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token heh6bo.7n85cftszx0hevpy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443": (27.730638671s)
	I0731 09:55:07.963316    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0731 09:55:08.368958    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000-m02 minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=false
	I0731 09:55:08.452570    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-393000-m02 node-role.kubernetes.io/control-plane:NoSchedule-
	I0731 09:55:08.540019    2954 start.go:319] duration metric: took 28.387749448s to joinCluster
	I0731 09:55:08.540065    2954 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:08.540296    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:08.563232    2954 out.go:177] * Verifying Kubernetes components...
	I0731 09:55:08.603726    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:08.841318    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:55:08.872308    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:55:08.872512    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 09:55:08.872555    2954 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 09:55:08.872732    2954 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m02" to be "Ready" ...
	I0731 09:55:08.872795    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:08.872800    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:08.872806    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:08.872810    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:08.881842    2954 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0731 09:55:09.372875    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:09.372888    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:09.372894    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:09.372897    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:09.374975    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:09.872917    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:09.872929    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:09.872935    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:09.872939    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:09.875869    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.372943    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:10.372956    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:10.372964    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:10.372967    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:10.375041    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.874945    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:10.875020    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:10.875035    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:10.875043    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:10.877858    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.878307    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:11.373440    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:11.373461    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:11.373468    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:11.373472    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:11.376182    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:11.874612    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:11.874624    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:11.874630    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:11.874634    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:11.876432    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:12.374085    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:12.374098    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:12.374104    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:12.374107    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:12.376039    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:12.874234    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:12.874246    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:12.874252    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:12.874255    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:12.876210    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:13.374284    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:13.374372    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:13.374387    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:13.374396    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:13.377959    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:13.378403    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:13.873814    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:13.873839    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:13.873850    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:13.873856    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:13.876640    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:14.373497    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:14.373550    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:14.373561    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:14.373570    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:14.376681    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:14.872976    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:14.873065    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:14.873079    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:14.873087    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:14.875607    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:15.373684    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:15.373702    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:15.373711    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:15.373716    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:15.375839    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:15.873002    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:15.873028    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:15.873040    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:15.873049    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:15.876311    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:15.877408    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:16.373017    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:16.373044    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:16.373110    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:16.373119    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:16.376651    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:16.873932    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:16.873951    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:16.873958    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:16.873961    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:16.875945    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:17.372883    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:17.372963    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:17.372979    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:17.372987    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:17.375706    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:17.874312    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:17.874334    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:17.874343    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:17.874381    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:17.876575    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:18.374077    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:18.374176    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:18.374191    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:18.374197    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:18.377131    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:18.377505    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:18.874567    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:18.874589    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:18.874653    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:18.874658    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:18.877221    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:19.373331    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:19.373347    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:19.373387    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:19.373392    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:19.375412    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:19.873283    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:19.873307    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:19.873320    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:19.873326    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:19.876694    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.373050    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:20.373075    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:20.373086    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:20.373096    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:20.376371    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.874379    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:20.874402    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:20.874414    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:20.874421    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:20.877609    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.878167    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:21.373483    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:21.373509    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:21.373520    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:21.373526    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:21.376649    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:21.872794    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:21.872825    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:21.872832    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:21.872837    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:21.874864    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:22.373703    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:22.373721    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:22.373733    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:22.373739    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:22.376275    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:22.872731    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:22.872746    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:22.872752    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:22.872756    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:22.875078    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:23.373989    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:23.374007    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:23.374017    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:23.374021    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:23.376252    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:23.376876    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:23.874071    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:23.874095    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:23.874118    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:23.874128    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:23.877415    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:24.373797    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:24.373828    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:24.373836    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:24.373842    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:24.375723    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:24.873198    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:24.873217    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:24.873239    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:24.873242    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:24.874997    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:25.373864    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:25.373964    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:25.373983    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:25.373993    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:25.376940    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:25.377783    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:25.873066    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:25.873140    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:25.873157    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:25.873167    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:25.876035    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:26.373560    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:26.373582    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:26.373594    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:26.373600    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:26.376763    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:26.872802    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:26.872826    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:26.872847    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:26.872855    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:26.875665    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.372793    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.372848    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.372859    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.372865    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.375283    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.872817    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.872887    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.872897    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.872902    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.875143    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.875477    2954 node_ready.go:49] node "ha-393000-m02" has status "Ready":"True"
	I0731 09:55:27.875491    2954 node_ready.go:38] duration metric: took 19.002910931s for node "ha-393000-m02" to be "Ready" ...
	I0731 09:55:27.875498    2954 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:55:27.875539    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:27.875545    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.875550    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.875554    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.884028    2954 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0731 09:55:27.888275    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.888338    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 09:55:27.888344    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.888351    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.888354    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.895154    2954 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 09:55:27.895668    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.895676    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.895682    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.895685    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.903221    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:55:27.903585    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.903594    2954 pod_ready.go:81] duration metric: took 15.30431ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.903601    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.903644    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 09:55:27.903649    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.903655    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.903659    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.910903    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:55:27.911272    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.911279    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.911284    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.911287    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.912846    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.913176    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.913184    2954 pod_ready.go:81] duration metric: took 9.57768ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.913191    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.913223    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 09:55:27.913228    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.913233    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.913237    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.914947    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.915374    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.915380    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.915386    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.915390    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.916800    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.917134    2954 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.917142    2954 pod_ready.go:81] duration metric: took 3.945951ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.917148    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.917182    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 09:55:27.917186    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.917192    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.917199    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.919108    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.919519    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.919526    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.919532    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.919538    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.920909    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.921212    2954 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.921221    2954 pod_ready.go:81] duration metric: took 4.068426ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.921231    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.073440    2954 request.go:629] Waited for 152.136555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:55:28.073539    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:55:28.073547    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.073555    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.073561    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.075944    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:28.272878    2954 request.go:629] Waited for 196.473522ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:28.272966    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:28.272972    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.272978    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.272981    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.274914    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:28.275308    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:28.275318    2954 pod_ready.go:81] duration metric: took 354.084518ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.275325    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.473409    2954 request.go:629] Waited for 198.051207ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:55:28.473441    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:55:28.473447    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.473463    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.473467    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.475323    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:28.673703    2954 request.go:629] Waited for 197.835098ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:28.673754    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:28.673765    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.673772    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.673777    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.676049    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:28.676485    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:28.676497    2954 pod_ready.go:81] duration metric: took 401.169334ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.676504    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.874899    2954 request.go:629] Waited for 198.343236ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:55:28.875005    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:55:28.875014    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.875025    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.875031    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.878371    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:29.072894    2954 request.go:629] Waited for 193.894527ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:29.072997    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:29.073009    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.073020    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.073029    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.075911    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.076354    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.076367    2954 pod_ready.go:81] duration metric: took 399.859987ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.076376    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.273708    2954 request.go:629] Waited for 197.294345ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:55:29.273758    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:55:29.273806    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.273815    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.273819    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.276500    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.473244    2954 request.go:629] Waited for 196.211404ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.473347    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.473355    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.473363    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.473367    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.475855    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.476256    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.476266    2954 pod_ready.go:81] duration metric: took 399.888458ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.476273    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.672987    2954 request.go:629] Waited for 196.670765ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:55:29.673094    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:55:29.673114    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.673128    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.673135    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.676240    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:29.874264    2954 request.go:629] Waited for 197.423472ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.874348    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.874352    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.874365    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.874369    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.876229    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:29.876542    2954 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.876551    2954 pod_ready.go:81] duration metric: took 400.273525ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.876557    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.073985    2954 request.go:629] Waited for 197.386483ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:55:30.074064    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:55:30.074071    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.074076    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.074080    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.075934    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:30.274353    2954 request.go:629] Waited for 197.921759ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.274399    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.274408    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.274421    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.274429    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.276767    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:30.277075    2954 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:30.277085    2954 pod_ready.go:81] duration metric: took 400.525562ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.277092    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.474867    2954 request.go:629] Waited for 197.733458ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:55:30.474919    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:55:30.474936    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.474949    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.474958    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.478180    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:30.673620    2954 request.go:629] Waited for 194.924994ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.673658    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.673662    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.673668    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.673674    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.675356    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:30.675625    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:30.675634    2954 pod_ready.go:81] duration metric: took 398.539654ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.675640    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.873712    2954 request.go:629] Waited for 198.03899ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:55:30.873795    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:55:30.873801    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.873807    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.873811    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.875750    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:31.074152    2954 request.go:629] Waited for 197.932145ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:31.074207    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:31.074215    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.074227    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.074234    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.077132    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:31.077723    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:31.077735    2954 pod_ready.go:81] duration metric: took 402.091925ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:31.077744    2954 pod_ready.go:38] duration metric: took 3.202266702s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:55:31.077770    2954 api_server.go:52] waiting for apiserver process to appear ...
	I0731 09:55:31.077872    2954 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:55:31.089706    2954 api_server.go:72] duration metric: took 22.549827849s to wait for apiserver process to appear ...
	I0731 09:55:31.089719    2954 api_server.go:88] waiting for apiserver healthz status ...
	I0731 09:55:31.089735    2954 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 09:55:31.093731    2954 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 09:55:31.093774    2954 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 09:55:31.093779    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.093785    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.093789    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.094287    2954 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 09:55:31.094337    2954 api_server.go:141] control plane version: v1.30.3
	I0731 09:55:31.094346    2954 api_server.go:131] duration metric: took 4.622445ms to wait for apiserver health ...
	I0731 09:55:31.094351    2954 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 09:55:31.272834    2954 request.go:629] Waited for 178.447514ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.272864    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.272868    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.272874    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.272879    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.275929    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:31.278922    2954 system_pods.go:59] 17 kube-system pods found
	I0731 09:55:31.278939    2954 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:55:31.278943    2954 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:55:31.278948    2954 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:55:31.278951    2954 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:55:31.278954    2954 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:55:31.278957    2954 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:55:31.278960    2954 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:55:31.278963    2954 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:55:31.278966    2954 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:55:31.278968    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:55:31.278971    2954 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:55:31.278973    2954 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:55:31.278976    2954 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:55:31.278982    2954 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:55:31.278986    2954 system_pods.go:61] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:55:31.278988    2954 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:55:31.278991    2954 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:55:31.278996    2954 system_pods.go:74] duration metric: took 184.642078ms to wait for pod list to return data ...
	I0731 09:55:31.279002    2954 default_sa.go:34] waiting for default service account to be created ...
	I0731 09:55:31.473455    2954 request.go:629] Waited for 194.413647ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:55:31.473487    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:55:31.473492    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.473498    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.473502    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.475460    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:31.475608    2954 default_sa.go:45] found service account: "default"
	I0731 09:55:31.475618    2954 default_sa.go:55] duration metric: took 196.612794ms for default service account to be created ...
	I0731 09:55:31.475624    2954 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 09:55:31.673326    2954 request.go:629] Waited for 197.663631ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.673362    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.673369    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.673377    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.673384    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.676582    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:31.680023    2954 system_pods.go:86] 17 kube-system pods found
	I0731 09:55:31.680035    2954 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:55:31.680039    2954 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:55:31.680042    2954 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:55:31.680045    2954 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:55:31.680048    2954 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:55:31.680051    2954 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:55:31.680054    2954 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:55:31.680057    2954 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:55:31.680060    2954 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:55:31.680063    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:55:31.680067    2954 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:55:31.680070    2954 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:55:31.680073    2954 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:55:31.680076    2954 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:55:31.680079    2954 system_pods.go:89] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:55:31.680082    2954 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:55:31.680085    2954 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:55:31.680089    2954 system_pods.go:126] duration metric: took 204.462284ms to wait for k8s-apps to be running ...
	I0731 09:55:31.680093    2954 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 09:55:31.680137    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:55:31.691384    2954 system_svc.go:56] duration metric: took 11.279108ms WaitForService to wait for kubelet
	I0731 09:55:31.691399    2954 kubeadm.go:582] duration metric: took 23.151526974s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:55:31.691411    2954 node_conditions.go:102] verifying NodePressure condition ...
	I0731 09:55:31.872842    2954 request.go:629] Waited for 181.393446ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 09:55:31.872873    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 09:55:31.872877    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.872884    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.872887    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.875560    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:31.876076    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:55:31.876090    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:55:31.876101    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:55:31.876111    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:55:31.876115    2954 node_conditions.go:105] duration metric: took 184.70211ms to run NodePressure ...
	I0731 09:55:31.876123    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:55:31.876138    2954 start.go:255] writing updated cluster config ...
	I0731 09:55:31.896708    2954 out.go:177] 
	I0731 09:55:31.917824    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:31.917916    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:31.939502    2954 out.go:177] * Starting "ha-393000-m03" control-plane node in "ha-393000" cluster
	I0731 09:55:31.981501    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:55:31.981523    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:55:31.981705    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:55:31.981717    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:55:31.981841    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:31.982574    2954 start.go:360] acquireMachinesLock for ha-393000-m03: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:55:31.982642    2954 start.go:364] duration metric: took 52.194µs to acquireMachinesLock for "ha-393000-m03"
	I0731 09:55:31.982663    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ing
ress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror:
DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m03 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:31.982776    2954 start.go:125] createHost starting for "m03" (driver="hyperkit")
	I0731 09:55:32.003523    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:55:32.003599    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:32.003626    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:32.012279    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51131
	I0731 09:55:32.012622    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:32.012991    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:32.013008    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:32.013225    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:32.013332    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:32.013417    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:32.013511    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:55:32.013531    2954 client.go:168] LocalClient.Create starting
	I0731 09:55:32.013562    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:55:32.013605    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:55:32.013616    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:55:32.013658    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:55:32.013685    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:55:32.013695    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:55:32.013708    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:55:32.013722    2954 main.go:141] libmachine: (ha-393000-m03) Calling .PreCreateCheck
	I0731 09:55:32.013796    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:32.013821    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:32.024803    2954 main.go:141] libmachine: Creating machine...
	I0731 09:55:32.024819    2954 main.go:141] libmachine: (ha-393000-m03) Calling .Create
	I0731 09:55:32.024954    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:32.025189    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.024948    2993 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:55:32.025311    2954 main.go:141] libmachine: (ha-393000-m03) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:55:32.387382    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.387300    2993 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa...
	I0731 09:55:32.468181    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.468125    2993 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk...
	I0731 09:55:32.468207    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Writing magic tar header
	I0731 09:55:32.468229    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Writing SSH key tar header
	I0731 09:55:32.468792    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.468762    2993 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03 ...
	I0731 09:55:33.078663    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:33.078680    2954 main.go:141] libmachine: (ha-393000-m03) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid
	I0731 09:55:33.078716    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Using UUID 451d6bef-97a7-42a6-8ccb-b8851dda0594
	I0731 09:55:33.103258    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Generated MAC 3e:56:a2:18:e2:4c
	I0731 09:55:33.103280    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:55:33.103347    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:55:33.103394    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:55:33.103443    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "451d6bef-97a7-42a6-8ccb-b8851dda0594", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machine
s/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:55:33.103490    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 451d6bef-97a7-42a6-8ccb-b8851dda0594 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=t
tyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:55:33.103507    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:55:33.106351    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Pid is 2994
	I0731 09:55:33.106790    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 0
	I0731 09:55:33.106810    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:33.106894    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:33.107878    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:33.107923    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:33.107940    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:33.107959    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:33.107977    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:33.107995    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:33.108059    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:33.114040    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:55:33.122160    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:55:33.123003    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:55:33.123036    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:55:33.123053    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:55:33.123062    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:55:33.505461    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:55:33.505481    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:55:33.620173    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:55:33.620193    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:55:33.620213    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:55:33.620225    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:55:33.621055    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:55:33.621064    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:55:35.108561    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 1
	I0731 09:55:35.108578    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:35.108664    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:35.109476    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:35.109527    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:35.109535    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:35.109543    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:35.109553    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:35.109564    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:35.109588    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:37.111452    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 2
	I0731 09:55:37.111469    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:37.111534    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:37.112347    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:37.112387    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:37.112400    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:37.112409    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:37.112418    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:37.112431    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:37.112438    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:39.113861    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 3
	I0731 09:55:39.113876    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:39.113989    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:39.114793    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:39.114841    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:39.114854    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:39.114871    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:39.114881    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:39.114894    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:39.114910    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:39.197635    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 09:55:39.197744    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 09:55:39.197756    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 09:55:39.222062    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 09:55:41.116408    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 4
	I0731 09:55:41.116425    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:41.116529    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:41.117328    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:41.117368    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:41.117376    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:41.117399    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:41.117416    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:41.117425    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:41.117441    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:43.117722    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 5
	I0731 09:55:43.117737    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:43.117828    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:43.118651    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:43.118699    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0731 09:55:43.118714    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 09:55:43.118721    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found match: 3e:56:a2:18:e2:4c
	I0731 09:55:43.118726    2954 main.go:141] libmachine: (ha-393000-m03) DBG | IP: 192.169.0.7
	I0731 09:55:43.118795    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:43.119393    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:43.119491    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:43.119572    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:55:43.119580    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:55:43.119659    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:43.119724    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:43.120517    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:55:43.120525    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:55:43.120529    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:55:43.120540    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:43.120627    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:43.120733    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:43.120830    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:43.120937    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:43.121066    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:43.121248    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:43.121256    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:55:44.180872    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:55:44.180885    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:55:44.180891    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.181020    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.181119    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.181200    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.181293    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.181426    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.181579    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.181587    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:55:44.244214    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:55:44.244264    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:55:44.244271    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:55:44.244277    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.244401    2954 buildroot.go:166] provisioning hostname "ha-393000-m03"
	I0731 09:55:44.244413    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.244502    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.244591    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.244669    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.244754    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.244838    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.244957    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.245103    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.245112    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m03 && echo "ha-393000-m03" | sudo tee /etc/hostname
	I0731 09:55:44.315698    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m03
	
	I0731 09:55:44.315714    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.315853    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.315950    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.316034    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.316117    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.316237    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.316383    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.316394    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:55:44.383039    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:55:44.383055    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:55:44.383064    2954 buildroot.go:174] setting up certificates
	I0731 09:55:44.383071    2954 provision.go:84] configureAuth start
	I0731 09:55:44.383077    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.383215    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:44.383314    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.383408    2954 provision.go:143] copyHostCerts
	I0731 09:55:44.383435    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:55:44.383482    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:55:44.383490    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:55:44.383608    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:55:44.383821    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:55:44.383853    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:55:44.383859    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:55:44.383930    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:55:44.384107    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:55:44.384137    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:55:44.384146    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:55:44.384214    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:55:44.384364    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m03 san=[127.0.0.1 192.169.0.7 ha-393000-m03 localhost minikube]
	I0731 09:55:44.436199    2954 provision.go:177] copyRemoteCerts
	I0731 09:55:44.436250    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:55:44.436265    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.436405    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.436484    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.436578    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.436651    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:44.474166    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:55:44.474251    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:55:44.495026    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:55:44.495089    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 09:55:44.514528    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:55:44.514597    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 09:55:44.534382    2954 provision.go:87] duration metric: took 151.304295ms to configureAuth
	I0731 09:55:44.534397    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:55:44.534572    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:44.534587    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:44.534721    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.534815    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.534895    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.534982    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.535063    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.535176    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.535303    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.535311    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:55:44.595832    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:55:44.595845    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:55:44.595915    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:55:44.595926    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.596055    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.596141    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.596224    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.596312    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.596436    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.596585    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.596629    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:55:44.668428    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
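The unit file echoed above depends on systemd's ExecStart reset semantics, which its own comments describe: for a non-oneshot service, a second `ExecStart=` is an error, so a drop-in that wants to replace the command must first clear the inherited one with an empty assignment. A schematic fragment of just that pattern (not part of the log; the flags are elided):

```
[Service]
# An empty assignment resets the inherited command list; without it,
# systemd rejects a second ExecStart= for a Type=notify service.
ExecStart=
ExecStart=/usr/bin/dockerd ...
```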
	I0731 09:55:44.668446    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.668587    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.668687    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.668775    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.668883    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.669009    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.669153    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.669165    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:55:46.245712    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 09:55:46.245728    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:55:46.245733    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetURL
	I0731 09:55:46.245877    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:55:46.245886    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:55:46.245891    2954 client.go:171] duration metric: took 14.176451747s to LocalClient.Create
	I0731 09:55:46.245904    2954 start.go:167] duration metric: took 14.176491485s to libmachine.API.Create "ha-393000"
	I0731 09:55:46.245910    2954 start.go:293] postStartSetup for "ha-393000-m03" (driver="hyperkit")
	I0731 09:55:46.245917    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:55:46.245936    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.246092    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:55:46.246107    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.246216    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.246326    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.246431    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.246511    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:46.290725    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:55:46.294553    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:55:46.294567    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:55:46.294659    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:55:46.294805    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:55:46.294812    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:55:46.294995    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:55:46.303032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:55:46.335630    2954 start.go:296] duration metric: took 89.711926ms for postStartSetup
	I0731 09:55:46.335676    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:46.336339    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:46.336499    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:46.336864    2954 start.go:128] duration metric: took 14.298177246s to createHost
	I0731 09:55:46.336879    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.336971    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.337062    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.337141    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.337213    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.337332    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:46.337451    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:46.337458    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:55:46.398217    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444946.512017695
	
	I0731 09:55:46.398229    2954 fix.go:216] guest clock: 1722444946.512017695
	I0731 09:55:46.398235    2954 fix.go:229] Guest: 2024-07-31 09:55:46.512017695 -0700 PDT Remote: 2024-07-31 09:55:46.336873 -0700 PDT m=+150.181968458 (delta=175.144695ms)
	I0731 09:55:46.398245    2954 fix.go:200] guest clock delta is within tolerance: 175.144695ms
	I0731 09:55:46.398250    2954 start.go:83] releasing machines lock for "ha-393000-m03", held for 14.359697621s
	I0731 09:55:46.398269    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.398407    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:46.418329    2954 out.go:177] * Found network options:
	I0731 09:55:46.439149    2954 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0731 09:55:46.477220    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 09:55:46.477241    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:55:46.477255    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.477897    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.478058    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.478150    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:55:46.478196    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	W0731 09:55:46.478232    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 09:55:46.478262    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:55:46.478353    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 09:55:46.478353    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.478369    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.478511    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.478558    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.478670    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.478731    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.478785    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:46.478828    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.478931    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	W0731 09:55:46.512520    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:55:46.512591    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:55:46.558288    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:55:46.558305    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:55:46.558391    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:55:46.574105    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:55:46.582997    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:55:46.591920    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:55:46.591969    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:55:46.600962    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:55:46.610057    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:55:46.619019    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:55:46.627876    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:55:46.637129    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:55:46.646079    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:55:46.655162    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:55:46.664198    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:55:46.672256    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:55:46.680371    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:46.778919    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:55:46.798064    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:55:46.798132    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:55:46.815390    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:55:46.827644    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:55:46.842559    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:55:46.853790    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:55:46.864444    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:55:46.887653    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:55:46.898070    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:55:46.913256    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:55:46.916263    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:55:46.923424    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:55:46.937344    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:55:47.035092    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:55:47.134788    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:55:47.134810    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:55:47.149022    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:47.247660    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:55:49.540717    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.293040269s)
	I0731 09:55:49.540778    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:55:49.551148    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:55:49.563946    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:55:49.574438    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:55:49.675905    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:55:49.777958    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:49.889335    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:55:49.903338    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:55:49.914450    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:50.020127    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:55:50.079269    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:55:50.079351    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:55:50.085411    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:55:50.085468    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:55:50.088527    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:55:50.115874    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:55:50.115947    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:55:50.133371    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:55:50.177817    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:55:50.199409    2954 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 09:55:50.242341    2954 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 09:55:50.263457    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:50.263780    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:55:50.267924    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:55:50.277257    2954 mustload.go:65] Loading cluster: ha-393000
	I0731 09:55:50.277434    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:50.277675    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:50.277699    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:50.286469    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51154
	I0731 09:55:50.286803    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:50.287152    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:50.287174    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:50.287405    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:50.287529    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:55:50.287619    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:50.287687    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:55:50.288682    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:55:50.288947    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:50.288976    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:50.297641    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51156
	I0731 09:55:50.297976    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:50.298336    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:50.298356    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:50.298557    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:50.298695    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:55:50.298796    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.7
	I0731 09:55:50.298803    2954 certs.go:194] generating shared ca certs ...
	I0731 09:55:50.298815    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.298953    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:55:50.299004    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:55:50.299013    2954 certs.go:256] generating profile certs ...
	I0731 09:55:50.299104    2954 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:55:50.299126    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb
	I0731 09:55:50.299146    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0731 09:55:50.438174    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb ...
	I0731 09:55:50.438189    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb: {Name:mk221449ac60933abd0b425ad947a6ab1580c0ce Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.438543    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb ...
	I0731 09:55:50.438553    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb: {Name:mk1cb7896668e4a7a9edaf8893989143a67a7948 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.438773    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:55:50.438957    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:55:50.439187    2954 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:55:50.439201    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:55:50.439224    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:55:50.439243    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:55:50.439262    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:55:50.439280    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:55:50.439299    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:55:50.439317    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:55:50.439334    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:55:50.439423    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:55:50.439459    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:55:50.439466    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:55:50.439503    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:55:50.439532    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:55:50.439561    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:55:50.439623    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:55:50.439662    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.439683    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.439702    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.439730    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:55:50.439869    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:55:50.439971    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:55:50.440060    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:55:50.440149    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:55:50.470145    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0731 09:55:50.473304    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 09:55:50.482843    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0731 09:55:50.486120    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 09:55:50.495117    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 09:55:50.498266    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 09:55:50.507788    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0731 09:55:50.510913    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 09:55:50.519933    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0731 09:55:50.523042    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 09:55:50.531891    2954 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0731 09:55:50.535096    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 09:55:50.544058    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:55:50.564330    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:55:50.585250    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:55:50.605412    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:55:50.625492    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1444 bytes)
	I0731 09:55:50.645935    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 09:55:50.666578    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:55:50.686734    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:55:50.707428    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:55:50.728977    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:55:50.749365    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:55:50.769217    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 09:55:50.782635    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 09:55:50.796452    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 09:55:50.810265    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 09:55:50.823856    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 09:55:50.837713    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 09:55:50.851806    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 09:55:50.865643    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:55:50.869985    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:55:50.878755    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.882092    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.882127    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.886361    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 09:55:50.894800    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:55:50.903511    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.906902    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.906941    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.911184    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:55:50.919457    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:55:50.927999    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.931344    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.931398    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.935641    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:55:50.944150    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:55:50.947330    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:55:50.947373    2954 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.30.3 docker true true} ...
	I0731 09:55:50.947432    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:55:50.947450    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:55:50.947488    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:55:50.960195    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:55:50.960253    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:55:50.960307    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:55:50.968017    2954 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 09:55:50.968069    2954 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256
	I0731 09:55:50.975489    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256
	I0731 09:55:50.975509    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:55:50.975519    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:55:50.975557    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:55:50.976020    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:55:50.987294    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:55:50.987330    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 09:55:50.987350    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 09:55:50.987377    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 09:55:50.987399    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 09:55:50.987416    2954 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:55:51.010057    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 09:55:51.010100    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 09:55:51.683575    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 09:55:51.690828    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 09:55:51.704403    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:55:51.718184    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 09:55:51.732058    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:55:51.735039    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:55:51.744606    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:51.842284    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:55:51.858313    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:55:51.858589    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:51.858612    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:51.867825    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51159
	I0731 09:55:51.868326    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:51.868657    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:51.868668    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:51.868882    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:51.868991    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:55:51.869077    2954 start.go:317] joinCluster: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 Clu
sterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:fals
e inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimi
zations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:55:51.869219    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0731 09:55:51.869241    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:55:51.869330    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:55:51.869408    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:55:51.869497    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:55:51.869579    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:55:51.957634    2954 start.go:343] trying to join control-plane node "m03" to cluster: &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:51.957691    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 3o7i0i.qey1hcj8w6i3nuyy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443"
	I0731 09:56:20.527748    2954 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 3o7i0i.qey1hcj8w6i3nuyy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443": (28.570050327s)
	I0731 09:56:20.527779    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0731 09:56:20.987700    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000-m03 minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=false
	I0731 09:56:21.064233    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-393000-m03 node-role.kubernetes.io/control-plane:NoSchedule-
	I0731 09:56:21.148165    2954 start.go:319] duration metric: took 29.279096383s to joinCluster
	I0731 09:56:21.148219    2954 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:56:21.148483    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:56:21.189791    2954 out.go:177] * Verifying Kubernetes components...
	I0731 09:56:21.248129    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:56:21.485219    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:56:21.507788    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:56:21.508040    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 09:56:21.508088    2954 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 09:56:21.508300    2954 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m03" to be "Ready" ...
	I0731 09:56:21.508342    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:21.508347    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:21.508353    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:21.508357    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:21.510586    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:22.008706    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:22.008723    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:22.008734    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:22.008738    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:22.010978    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:22.509350    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:22.509366    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:22.509372    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:22.509375    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:22.511656    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:23.009510    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:23.009526    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:23.009532    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:23.009535    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:23.011420    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:23.508500    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:23.508516    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:23.508523    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:23.508526    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:23.510720    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:23.511145    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:24.009377    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:24.009394    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:24.009439    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:24.009443    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:24.011828    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:24.509345    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:24.509361    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:24.509368    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:24.509372    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:24.511614    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:25.009402    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:25.009418    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:25.009424    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:25.009428    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:25.011344    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:25.508774    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:25.508790    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:25.508797    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:25.508800    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:25.510932    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:25.511292    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:26.008449    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:26.008465    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:26.008471    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:26.008474    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:26.010614    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:26.509754    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:26.509786    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:26.509799    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:26.509805    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:26.512347    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:27.008498    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:27.008592    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:27.008608    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:27.008615    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:27.011956    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:27.509028    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:27.509110    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:27.509125    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:27.509132    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:27.512133    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:27.512700    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:28.008990    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:28.009083    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:28.009097    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:28.009103    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:28.012126    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:28.509594    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:28.509612    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:28.509621    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:28.509625    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:28.512206    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:29.009613    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:29.009628    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:29.009634    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:29.009637    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:29.011661    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:29.509044    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:29.509059    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:29.509065    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:29.509068    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:29.511159    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:30.008831    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:30.008905    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:30.008916    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:30.008922    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:30.011246    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:30.011529    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:30.509817    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:30.509832    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:30.509838    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:30.509846    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:30.511920    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:31.008461    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:31.008483    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:31.008493    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:31.008499    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:31.011053    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:31.509184    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:31.509236    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:31.509247    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:31.509252    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:31.511776    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:32.008486    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:32.008510    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:32.008522    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:32.008531    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:32.011649    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:32.012066    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:32.510023    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:32.510037    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:32.510044    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:32.510048    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:32.512097    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:33.010283    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:33.010301    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:33.010310    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:33.010314    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:33.012927    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:33.509693    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:33.509712    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:33.509722    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:33.509726    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:33.512086    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.008568    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:34.008586    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:34.008594    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:34.008599    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:34.010823    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.509266    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:34.509365    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:34.509380    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:34.509386    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:34.512417    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.512850    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:35.009777    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:35.009792    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:35.009799    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:35.009802    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:35.011859    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:35.508525    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:35.508582    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:35.508590    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:35.508596    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:35.510810    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:36.009838    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:36.009864    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:36.009876    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:36.009881    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:36.012816    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:36.509201    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:36.509215    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:36.509265    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:36.509269    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:36.511244    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:37.010038    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:37.010064    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:37.010077    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:37.010083    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:37.013339    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:37.013728    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:37.509315    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:37.509330    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:37.509336    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:37.509339    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:37.511753    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:38.009336    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:38.009405    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:38.009415    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:38.009428    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:38.011725    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:38.508458    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:38.508483    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:38.508493    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:38.508500    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:38.511720    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:39.008429    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:39.008452    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:39.008459    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:39.008463    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:39.010408    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:39.508530    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:39.508555    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:39.508569    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:39.508577    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:39.511916    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:39.512435    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:40.009629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.009648    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.009663    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.009668    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.011742    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.509939    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.509963    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.509976    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.509982    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.512891    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.513173    2954 node_ready.go:49] node "ha-393000-m03" has status "Ready":"True"
	I0731 09:56:40.513182    2954 node_ready.go:38] duration metric: took 19.004877925s for node "ha-393000-m03" to be "Ready" ...
	I0731 09:56:40.513193    2954 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:56:40.513230    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:40.513235    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.513241    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.513244    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.517063    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:40.521698    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.521758    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 09:56:40.521763    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.521769    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.521773    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.524012    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.524507    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.524515    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.524521    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.524525    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.526095    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.526522    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.526532    2954 pod_ready.go:81] duration metric: took 4.820449ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.526539    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.526579    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 09:56:40.526584    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.526589    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.526597    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.528189    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.528737    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.528744    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.528750    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.528754    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.530442    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.530775    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.530784    2954 pod_ready.go:81] duration metric: took 4.239462ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.530790    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.530822    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 09:56:40.530827    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.530833    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.530840    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.532590    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.533050    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.533057    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.533062    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.533066    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.534760    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.535110    2954 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.535119    2954 pod_ready.go:81] duration metric: took 4.323936ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.535125    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.535164    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 09:56:40.535170    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.535175    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.535178    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.536947    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.537444    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:40.537451    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.537456    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.537460    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.539136    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.539571    2954 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.539580    2954 pod_ready.go:81] duration metric: took 4.45006ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.539587    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.710116    2954 request.go:629] Waited for 170.494917ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 09:56:40.710174    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 09:56:40.710180    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.710187    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.710190    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.712323    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.910582    2954 request.go:629] Waited for 197.870555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.910719    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.910732    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.910743    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.910750    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.913867    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:40.914265    2954 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.914278    2954 pod_ready.go:81] duration metric: took 374.68494ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.914293    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.110758    2954 request.go:629] Waited for 196.414025ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:56:41.110829    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:56:41.110835    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.110841    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.110844    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.112890    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:41.311962    2954 request.go:629] Waited for 198.609388ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:41.311995    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:41.312000    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.312006    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.312010    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.314041    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:41.314399    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:41.314410    2954 pod_ready.go:81] duration metric: took 400.109149ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.314418    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.511371    2954 request.go:629] Waited for 196.905615ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:56:41.511497    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:56:41.511508    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.511519    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.511526    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.514702    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:41.710099    2954 request.go:629] Waited for 194.801702ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:41.710131    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:41.710137    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.710143    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.710148    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.711902    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:41.712201    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:41.712211    2954 pod_ready.go:81] duration metric: took 397.788368ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.712225    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.910472    2954 request.go:629] Waited for 198.191914ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 09:56:41.910629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 09:56:41.910640    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.910651    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.910657    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.913895    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:42.111114    2954 request.go:629] Waited for 196.678487ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:42.111206    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:42.111214    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.111222    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.111228    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.113500    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.113867    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.113876    2954 pod_ready.go:81] duration metric: took 401.646528ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.113883    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.310054    2954 request.go:629] Waited for 196.129077ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:56:42.310144    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:56:42.310151    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.310157    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.310161    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.312081    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:42.510104    2954 request.go:629] Waited for 197.491787ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:42.510220    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:42.510230    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.510241    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.510249    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.512958    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.513508    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.513521    2954 pod_ready.go:81] duration metric: took 399.632057ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.513531    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.710421    2954 request.go:629] Waited for 196.851281ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:56:42.710510    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:56:42.710517    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.710523    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.710527    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.713018    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.910158    2954 request.go:629] Waited for 196.774024ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:42.910295    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:42.910307    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.910319    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.910327    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.913021    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.913406    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.913416    2954 pod_ready.go:81] duration metric: took 399.880068ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.913423    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.110445    2954 request.go:629] Waited for 196.965043ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 09:56:43.110548    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 09:56:43.110603    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.110615    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.110630    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.113588    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.311083    2954 request.go:629] Waited for 196.925492ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:43.311134    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:43.311139    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.311146    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.311149    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.313184    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.313462    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:43.313472    2954 pod_ready.go:81] duration metric: took 400.04465ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.313479    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.510584    2954 request.go:629] Waited for 197.060501ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:56:43.510710    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:56:43.510722    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.510731    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.510737    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.513575    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.710025    2954 request.go:629] Waited for 195.991998ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:43.710104    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:43.710111    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.710117    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.710121    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.712314    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.712653    2954 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:43.712663    2954 pod_ready.go:81] duration metric: took 399.178979ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.712670    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.910041    2954 request.go:629] Waited for 197.319656ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 09:56:43.910085    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 09:56:43.910092    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.910100    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.910108    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.913033    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.110409    2954 request.go:629] Waited for 196.775647ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:44.110512    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:44.110520    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.110526    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.110530    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.112726    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.113050    2954 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.113060    2954 pod_ready.go:81] duration metric: took 400.385455ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.113067    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.310143    2954 request.go:629] Waited for 197.043092ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:56:44.310236    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:56:44.310243    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.310253    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.310258    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.312471    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.510561    2954 request.go:629] Waited for 197.642859ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.510715    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.510728    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.510742    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.510750    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.513815    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:44.514349    2954 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.514363    2954 pod_ready.go:81] duration metric: took 401.290361ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.514372    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.711407    2954 request.go:629] Waited for 196.995177ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:56:44.711475    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:56:44.711482    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.711488    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.711491    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.713573    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.910056    2954 request.go:629] Waited for 196.042855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.910095    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.910103    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.910112    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.910117    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.912608    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.912924    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.912934    2954 pod_ready.go:81] duration metric: took 398.555138ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.912941    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.112001    2954 request.go:629] Waited for 199.012783ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:56:45.112114    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:56:45.112125    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.112136    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.112142    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.115328    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:45.310138    2954 request.go:629] Waited for 194.249421ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:45.310197    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:45.310207    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.310217    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.310226    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.315131    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:45.315432    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:45.315442    2954 pod_ready.go:81] duration metric: took 402.495485ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.315449    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.510510    2954 request.go:629] Waited for 195.017136ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 09:56:45.510595    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 09:56:45.510601    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.510607    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.510614    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.512663    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:45.709970    2954 request.go:629] Waited for 196.900157ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:45.710056    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:45.710063    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.710069    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.710073    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.712279    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:45.712540    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:45.712550    2954 pod_ready.go:81] duration metric: took 397.095893ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.712557    2954 pod_ready.go:38] duration metric: took 5.199358243s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:56:45.712568    2954 api_server.go:52] waiting for apiserver process to appear ...
	I0731 09:56:45.712620    2954 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:56:45.724210    2954 api_server.go:72] duration metric: took 24.575970869s to wait for apiserver process to appear ...
	I0731 09:56:45.724224    2954 api_server.go:88] waiting for apiserver healthz status ...
	I0731 09:56:45.724236    2954 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 09:56:45.729801    2954 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 09:56:45.729848    2954 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 09:56:45.729855    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.729862    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.729867    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.731097    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:45.731132    2954 api_server.go:141] control plane version: v1.30.3
	I0731 09:56:45.731141    2954 api_server.go:131] duration metric: took 6.912618ms to wait for apiserver health ...
	I0731 09:56:45.731147    2954 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 09:56:45.910423    2954 request.go:629] Waited for 179.236536ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:45.910520    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:45.910529    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.910537    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.910541    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.914926    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:45.919715    2954 system_pods.go:59] 24 kube-system pods found
	I0731 09:56:45.919728    2954 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:56:45.919732    2954 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:56:45.919735    2954 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:56:45.919738    2954 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:56:45.919742    2954 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 09:56:45.919745    2954 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:56:45.919748    2954 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:56:45.919750    2954 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 09:56:45.919753    2954 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:56:45.919756    2954 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:56:45.919759    2954 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 09:56:45.919761    2954 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:56:45.919764    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:56:45.919767    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 09:56:45.919770    2954 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:56:45.919773    2954 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 09:56:45.919776    2954 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:56:45.919778    2954 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:56:45.919780    2954 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:56:45.919783    2954 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 09:56:45.919785    2954 system_pods.go:61] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:56:45.919789    2954 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:56:45.919792    2954 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 09:56:45.919795    2954 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:56:45.919799    2954 system_pods.go:74] duration metric: took 188.647794ms to wait for pod list to return data ...
	I0731 09:56:45.919808    2954 default_sa.go:34] waiting for default service account to be created ...
	I0731 09:56:46.110503    2954 request.go:629] Waited for 190.648848ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:56:46.110629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:56:46.110641    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.110653    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.110659    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.113864    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:46.113948    2954 default_sa.go:45] found service account: "default"
	I0731 09:56:46.113959    2954 default_sa.go:55] duration metric: took 194.145984ms for default service account to be created ...
	I0731 09:56:46.113966    2954 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 09:56:46.310339    2954 request.go:629] Waited for 196.331355ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:46.310381    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:46.310387    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.310420    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.310424    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.314581    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:46.318894    2954 system_pods.go:86] 24 kube-system pods found
	I0731 09:56:46.318910    2954 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:56:46.318914    2954 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:56:46.318918    2954 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:56:46.318921    2954 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:56:46.318926    2954 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 09:56:46.318931    2954 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:56:46.318934    2954 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:56:46.318939    2954 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 09:56:46.318942    2954 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:56:46.318946    2954 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:56:46.318950    2954 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 09:56:46.318955    2954 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:56:46.318958    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:56:46.318963    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 09:56:46.318966    2954 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:56:46.318970    2954 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 09:56:46.318973    2954 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:56:46.318976    2954 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:56:46.318980    2954 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:56:46.318983    2954 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 09:56:46.318987    2954 system_pods.go:89] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:56:46.318990    2954 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:56:46.318993    2954 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 09:56:46.318996    2954 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:56:46.319002    2954 system_pods.go:126] duration metric: took 205.029246ms to wait for k8s-apps to be running ...
	I0731 09:56:46.319007    2954 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 09:56:46.319063    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:56:46.330197    2954 system_svc.go:56] duration metric: took 11.183343ms WaitForService to wait for kubelet
	I0731 09:56:46.330213    2954 kubeadm.go:582] duration metric: took 25.181975511s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:56:46.330225    2954 node_conditions.go:102] verifying NodePressure condition ...
	I0731 09:56:46.509976    2954 request.go:629] Waited for 179.711714ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 09:56:46.510033    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 09:56:46.510039    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.510045    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.510049    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.512677    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:46.513343    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513352    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513358    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513361    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513364    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513367    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513371    2954 node_conditions.go:105] duration metric: took 183.142994ms to run NodePressure ...
	I0731 09:56:46.513378    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:56:46.513392    2954 start.go:255] writing updated cluster config ...
	I0731 09:56:46.513784    2954 ssh_runner.go:195] Run: rm -f paused
	I0731 09:56:46.555311    2954 start.go:600] kubectl: 1.29.2, cluster: 1.30.3 (minor skew: 1)
	I0731 09:56:46.577040    2954 out.go:177] * Done! kubectl is now configured to use "ha-393000" cluster and "default" namespace by default
	
	
	==> Docker <==
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/25b3d6db405f49d365d6f33539e94ee4547921a7d0c463b94585056341530cda/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/c2a288a20831d0407ed1a2c3eeeb19a9758ef98813b916541258c8c58bcce38c/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/480020f5f9c0ce2e553e007beff5dfbe53b17bd2beaa73039be50701f04b9e76/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428712215Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428950502Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428960130Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.429078581Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477484798Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477564679Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477577219Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477869035Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507078466Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507147792Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507166914Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507244276Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853207982Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853706000Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853772518Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.854059851Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:56:47Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/e9ce137a2245c1333d3f3961469d32237e88656784f689211ed86cae2fd5518f/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Jul 31 16:56:49 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:56:49Z" level=info msg="Stop pulling image gcr.io/k8s-minikube/busybox:1.28: Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:1.28"
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157487366Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157549945Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157563641Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.158058722Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   About a minute ago   Running             busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         4 minutes ago        Running             coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         4 minutes ago        Running             coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	6d966e37d3618       6e38f40d628db                                                                                         4 minutes ago        Running             storage-provisioner       0                   25b3d6db405f4       storage-provisioner
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              4 minutes ago        Running             kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         4 minutes ago        Running             kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	e68314e525ef8       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     4 minutes ago        Running             kube-vip                  0                   c9f21d49b1384       kube-vip-ha-393000
	ab4f453cbe097       1f6d574d502f3                                                                                         4 minutes ago        Running             kube-apiserver            0                   7dc7f319faa98       kube-apiserver-ha-393000
	63e56744c84ee       3861cfcd7c04c                                                                                         4 minutes ago        Running             etcd                      0                   f8f20b1290499       etcd-ha-393000
	e19f7878939c9       76932a3b37d7e                                                                                         4 minutes ago        Running             kube-controller-manager   0                   67c995d2d2a3b       kube-controller-manager-ha-393000
	65412448c586b       3edc18e7b7672                                                                                         4 minutes ago        Running             kube-scheduler            0                   7ab9affa89eca       kube-scheduler-ha-393000
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:34336 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000091143s
	[INFO] 10.244.2.2:60404 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000085158s
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	
	
	==> coredns [feda36fb8a03] <==
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:43418 - 53237 "HINFO IN 5926041632293031093.721085148118182160. udp 56 false 512" NXDOMAIN qr,rd,ra 131 0.013101738s
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	
	
	==> describe nodes <==
	Name:               ha-393000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:53:48 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 16:58:18 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:54:24 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-393000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 baf02d554c20474b9fadb280fa1b8544
	  System UUID:                2cfe48dd-0000-0000-9b98-537ad9823a95
	  Boot ID:                    d6aa7e74-2f58-4a9d-a5df-37153dda8239
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-b94zr              0 (0%)        0 (0%)      0 (0%)           0 (0%)         100s
	  kube-system                 coredns-7db6d8ff4d-5m8st             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     4m22s
	  kube-system                 coredns-7db6d8ff4d-wvqjl             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     4m22s
	  kube-system                 etcd-ha-393000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         4m36s
	  kube-system                 kindnet-hjm7c                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      4m22s
	  kube-system                 kube-apiserver-ha-393000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         4m36s
	  kube-system                 kube-controller-manager-ha-393000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         4m37s
	  kube-system                 kube-proxy-zc52f                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m22s
	  kube-system                 kube-scheduler-ha-393000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         4m36s
	  kube-system                 kube-vip-ha-393000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m39s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m21s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age    From             Message
	  ----    ------                   ----   ----             -------
	  Normal  Starting                 4m21s  kube-proxy       
	  Normal  Starting                 4m36s  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  4m36s  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  4m36s  kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    4m36s  kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     4m36s  kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           4m23s  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  NodeReady                4m3s   kubelet          Node ha-393000 status is now: NodeReady
	  Normal  RegisteredNode           3m4s   node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           112s   node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	
	
	Name:               ha-393000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:55:06 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 16:58:10 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:57:08 +0000   Wed, 31 Jul 2024 16:55:27 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-393000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 ef1036a76f3140bd891095c317498193
	  System UUID:                7863443c-0000-0000-8e8d-bbd47bc06547
	  Boot ID:                    d1d2508d-2745-4c36-9513-9d28d75304e0
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-zln22                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         100s
	  kube-system                 etcd-ha-393000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         3m19s
	  kube-system                 kindnet-lcwbs                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      3m21s
	  kube-system                 kube-apiserver-ha-393000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         3m19s
	  kube-system                 kube-controller-manager-ha-393000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         3m19s
	  kube-system                 kube-proxy-cf577                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m21s
	  kube-system                 kube-scheduler-ha-393000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         3m19s
	  kube-system                 kube-vip-ha-393000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m17s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 3m17s                  kube-proxy       
	  Normal  NodeHasSufficientMemory  3m21s (x8 over 3m21s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    3m21s (x8 over 3m21s)  kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     3m21s (x7 over 3m21s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  3m21s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           3m18s                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal  RegisteredNode           3m4s                   node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal  RegisteredNode           112s                   node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	
	
	Name:               ha-393000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:56:18 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 16:58:20 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:40 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-393000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 86f4bf9242d1461e9aec7b900dfd2277
	  System UUID:                451d42a6-0000-0000-8ccb-b8851dda0594
	  Boot ID:                    07f25a3c-b688-461e-9d49-0a60051d0c3c
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-n8d7h                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         100s
	  kube-system                 etcd-ha-393000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         2m7s
	  kube-system                 kindnet-s2pv6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      2m9s
	  kube-system                 kube-apiserver-ha-393000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         2m8s
	  kube-system                 kube-controller-manager-ha-393000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         2m7s
	  kube-system                 kube-proxy-cr9pg                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         2m9s
	  kube-system                 kube-scheduler-ha-393000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         2m8s
	  kube-system                 kube-vip-ha-393000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         2m5s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                  From             Message
	  ----    ------                   ----                 ----             -------
	  Normal  Starting                 2m5s                 kube-proxy       
	  Normal  NodeHasSufficientMemory  2m9s (x8 over 2m9s)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    2m9s (x8 over 2m9s)  kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     2m9s (x7 over 2m9s)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  2m9s                 kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           2m8s                 node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal  RegisteredNode           2m4s                 node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal  RegisteredNode           112s                 node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	
	
	==> dmesg <==
	[  +2.764750] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.236579] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000003] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.776173] systemd-fstab-generator[496]: Ignoring "noauto" option for root device
	[  +0.099418] systemd-fstab-generator[508]: Ignoring "noauto" option for root device
	[  +1.822617] systemd-fstab-generator[843]: Ignoring "noauto" option for root device
	[  +0.280031] systemd-fstab-generator[881]: Ignoring "noauto" option for root device
	[  +0.062769] kauditd_printk_skb: 95 callbacks suppressed
	[  +0.051458] systemd-fstab-generator[893]: Ignoring "noauto" option for root device
	[  +0.120058] systemd-fstab-generator[907]: Ignoring "noauto" option for root device
	[  +2.468123] systemd-fstab-generator[1123]: Ignoring "noauto" option for root device
	[  +0.099873] systemd-fstab-generator[1135]: Ignoring "noauto" option for root device
	[  +0.092257] systemd-fstab-generator[1147]: Ignoring "noauto" option for root device
	[  +0.106918] systemd-fstab-generator[1162]: Ignoring "noauto" option for root device
	[  +3.770701] systemd-fstab-generator[1268]: Ignoring "noauto" option for root device
	[  +0.056009] kauditd_printk_skb: 180 callbacks suppressed
	[  +2.552095] systemd-fstab-generator[1523]: Ignoring "noauto" option for root device
	[  +4.084188] systemd-fstab-generator[1702]: Ignoring "noauto" option for root device
	[  +0.054525] kauditd_printk_skb: 70 callbacks suppressed
	[  +7.033653] systemd-fstab-generator[2202]: Ignoring "noauto" option for root device
	[  +0.072815] kauditd_printk_skb: 72 callbacks suppressed
	[Jul31 16:54] kauditd_printk_skb: 12 callbacks suppressed
	[ +19.132251] kauditd_printk_skb: 38 callbacks suppressed
	[Jul31 16:55] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [63e56744c84e] <==
	{"level":"warn","ts":"2024-07-31T16:58:26.117959Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"1c40d7bfcdf14e3b","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: i/o timeout"}
	{"level":"warn","ts":"2024-07-31T16:58:27.674582Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.774864Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.795136Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.800161Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.803396Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.809248Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.813581Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.817254Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.821334Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.823995Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.826739Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.833016Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.83709Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.841392Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.844006Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.847356Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.852314Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.856196Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.860105Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.874286Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.975037Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.992352Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:27.993439Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T16:58:28.075039Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b","remote-peer-name":"pipeline","remote-peer-active":false}
	
	
	==> kernel <==
	 16:58:28 up 5 min,  0 users,  load average: 0.54, 0.45, 0.20
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:57:40.110749       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:57:50.109884       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:57:50.109964       1 main.go:299] handling current node
	I0731 16:57:50.109980       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:57:50.110115       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:57:50.110404       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:57:50.110446       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:58:00.116121       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:58:00.116198       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:58:00.116281       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:58:00.116321       1 main.go:299] handling current node
	I0731 16:58:00.116341       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:58:00.116353       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:58:10.110132       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:58:10.110172       1 main.go:299] handling current node
	I0731 16:58:10.110185       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:58:10.110190       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:58:10.110340       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:58:10.110368       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:58:20.119116       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:58:20.119157       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:58:20.119310       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:58:20.119369       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:58:20.119485       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:58:20.119513       1 main.go:299] handling current node
	
	
	==> kube-apiserver [ab4f453cbe09] <==
	I0731 16:53:49.787246       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0731 16:53:49.838971       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0731 16:53:49.842649       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	I0731 16:53:49.843317       1 controller.go:615] quota admission added evaluator for: endpoints
	I0731 16:53:49.845885       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0731 16:53:50.451090       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0731 16:53:51.578858       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0731 16:53:51.587918       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0731 16:53:51.594571       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0731 16:54:05.505988       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	I0731 16:54:05.655031       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	E0731 16:56:52.014947       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51195: use of closed network connection
	E0731 16:56:52.206354       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51197: use of closed network connection
	E0731 16:56:52.403109       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51199: use of closed network connection
	E0731 16:56:52.600256       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51201: use of closed network connection
	E0731 16:56:52.785054       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51203: use of closed network connection
	E0731 16:56:53.004706       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51205: use of closed network connection
	E0731 16:56:53.208399       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51207: use of closed network connection
	E0731 16:56:53.392187       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51209: use of closed network connection
	E0731 16:56:53.714246       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51212: use of closed network connection
	E0731 16:56:53.895301       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51214: use of closed network connection
	E0731 16:56:54.078794       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51216: use of closed network connection
	E0731 16:56:54.262767       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51218: use of closed network connection
	E0731 16:56:54.448344       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51220: use of closed network connection
	E0731 16:56:54.629926       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51222: use of closed network connection
	
	
	==> kube-controller-manager [e19f7878939c] <==
	I0731 16:54:25.766270       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="26.902µs"
	I0731 16:54:29.808610       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0731 16:55:06.430472       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-393000-m02\" does not exist"
	I0731 16:55:06.448216       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-393000-m02" podCIDRs=["10.244.1.0/24"]
	I0731 16:55:09.814349       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-393000-m02"
	E0731 16:56:18.277948       1 certificate_controller.go:146] Sync csr-v42tm failed with : error updating signature for csr: Operation cannot be fulfilled on certificatesigningrequests.certificates.k8s.io "csr-v42tm": the object has been modified; please apply your changes to the latest version and try again
	I0731 16:56:18.384134       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-393000-m03\" does not exist"
	I0731 16:56:18.398095       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-393000-m03" podCIDRs=["10.244.2.0/24"]
	I0731 16:56:19.822872       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-393000-m03"
	I0731 16:56:47.522324       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="152.941157ms"
	I0731 16:56:47.574976       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="52.539469ms"
	I0731 16:56:47.678922       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="103.895055ms"
	I0731 16:56:47.701560       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="22.534098ms"
	I0731 16:56:47.701787       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="74.391µs"
	I0731 16:56:47.718186       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="8.079697ms"
	I0731 16:56:47.718269       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="39.867µs"
	I0731 16:56:47.744772       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="8.73015ms"
	I0731 16:56:47.745065       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="34.302µs"
	I0731 16:56:48.288860       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="38.605µs"
	I0731 16:56:49.532986       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.769402ms"
	I0731 16:56:49.533229       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="37.061µs"
	I0731 16:56:49.677499       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.411426ms"
	I0731 16:56:49.677560       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="21.894µs"
	I0731 16:56:51.343350       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="15.340858ms"
	I0731 16:56:51.343434       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="38.532µs"
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [65412448c586] <==
	W0731 16:53:48.491080       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0731 16:53:48.491132       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0731 16:53:48.491335       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:48.491387       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0731 16:53:48.491507       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0731 16:53:48.491594       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0731 16:53:48.491662       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:48.491738       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:48.491818       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:48.491860       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:48.491537       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:48.491873       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.319781       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0731 16:53:49.319838       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0731 16:53:49.326442       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.326478       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.392116       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:49.392172       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:49.496014       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.496036       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.541411       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:49.541927       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:49.588695       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:49.588735       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	I0731 16:53:49.982415       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 31 16:54:25 ha-393000 kubelet[2209]: I0731 16:54:25.725648    2209 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-5m8st" podStartSLOduration=20.725636637 podStartE2EDuration="20.725636637s" podCreationTimestamp="2024-07-31 16:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-31 16:54:25.724822579 +0000 UTC m=+33.396938994" watchObservedRunningTime="2024-07-31 16:54:25.725636637 +0000 UTC m=+33.397753046"
	Jul 31 16:54:25 ha-393000 kubelet[2209]: I0731 16:54:25.753514    2209 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/storage-provisioner" podStartSLOduration=19.753503033 podStartE2EDuration="19.753503033s" podCreationTimestamp="2024-07-31 16:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-31 16:54:25.752974741 +0000 UTC m=+33.425091155" watchObservedRunningTime="2024-07-31 16:54:25.753503033 +0000 UTC m=+33.425619443"
	Jul 31 16:54:52 ha-393000 kubelet[2209]: E0731 16:54:52.468990    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:54:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:54:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:54:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:54:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:55:52 ha-393000 kubelet[2209]: E0731 16:55:52.468170    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:55:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:55:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:55:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:55:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.510532    2209 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-wvqjl" podStartSLOduration=162.510247367 podStartE2EDuration="2m42.510247367s" podCreationTimestamp="2024-07-31 16:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-31 16:54:25.761498183 +0000 UTC m=+33.433614594" watchObservedRunningTime="2024-07-31 16:56:47.510247367 +0000 UTC m=+175.182363776"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.510944    2209 topology_manager.go:215] "Topology Admit Handler" podUID="dd382c29-63af-44cb-bf5b-b7db27f11017" podNamespace="default" podName="busybox-fc5497c4f-b94zr"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.640155    2209 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp8k4\" (UniqueName: \"kubernetes.io/projected/dd382c29-63af-44cb-bf5b-b7db27f11017-kube-api-access-cp8k4\") pod \"busybox-fc5497c4f-b94zr\" (UID: \"dd382c29-63af-44cb-bf5b-b7db27f11017\") " pod="default/busybox-fc5497c4f-b94zr"
	Jul 31 16:56:52 ha-393000 kubelet[2209]: E0731 16:56:52.472632    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:56:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:57:52 ha-393000 kubelet[2209]: E0731 16:57:52.468077    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:57:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:261: (dbg) Run:  kubectl --context ha-393000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:285: <<< TestMultiControlPlane/serial/StopSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/StopSecondaryNode (11.78s)

                                                
                                    
TestMultiControlPlane/serial/RestartSecondaryNode (99.36s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/RestartSecondaryNode
ha_test.go:420: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 node start m02 -v=7 --alsologtostderr
ha_test.go:420: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 node start m02 -v=7 --alsologtostderr: (41.027161063s)
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 2 (454.945855ms)

                                                
                                                
-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 09:59:10.674375    3214 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:59:10.674565    3214 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:10.674570    3214 out.go:304] Setting ErrFile to fd 2...
	I0731 09:59:10.674574    3214 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:10.674753    3214 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:59:10.674938    3214 out.go:298] Setting JSON to false
	I0731 09:59:10.674960    3214 mustload.go:65] Loading cluster: ha-393000
	I0731 09:59:10.675000    3214 notify.go:220] Checking for updates...
	I0731 09:59:10.675266    3214 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:59:10.675283    3214 status.go:255] checking status of ha-393000 ...
	I0731 09:59:10.675657    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:10.675718    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:10.684836    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51460
	I0731 09:59:10.685173    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:10.685576    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:10.685602    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:10.685793    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:10.685892    3214 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:59:10.685975    3214 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:10.686049    3214 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:59:10.687127    3214 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 09:59:10.687148    3214 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:10.687405    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:10.687425    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:10.695879    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51462
	I0731 09:59:10.696201    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:10.696537    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:10.696552    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:10.696791    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:10.696911    3214 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:59:10.696989    3214 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:10.697252    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:10.697278    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:10.705828    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51464
	I0731 09:59:10.706147    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:10.706488    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:10.706503    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:10.706733    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:10.706846    3214 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:59:10.706981    3214 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:10.707003    3214 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:59:10.707084    3214 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:59:10.707161    3214 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:59:10.707249    3214 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:59:10.707342    3214 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:59:10.742729    3214 ssh_runner.go:195] Run: systemctl --version
	I0731 09:59:10.747047    3214 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:10.757781    3214 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:10.757805    3214 api_server.go:166] Checking apiserver status ...
	I0731 09:59:10.757846    3214 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:10.769244    3214 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:59:10.776804    3214 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:10.776856    3214 ssh_runner.go:195] Run: ls
	I0731 09:59:10.780242    3214 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:10.784686    3214 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:10.784700    3214 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 09:59:10.784709    3214 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:10.784721    3214 status.go:255] checking status of ha-393000-m02 ...
	I0731 09:59:10.784978    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:10.785000    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:10.793945    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51468
	I0731 09:59:10.794268    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:10.794636    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:10.794653    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:10.794833    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:10.794947    3214 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:59:10.795032    3214 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:10.795102    3214 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 09:59:10.796102    3214 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 09:59:10.796111    3214 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:10.796358    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:10.796387    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:10.805085    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51470
	I0731 09:59:10.805462    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:10.805792    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:10.805803    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:10.806031    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:10.806199    3214 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:59:10.806292    3214 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:10.806580    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:10.806605    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:10.815329    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51472
	I0731 09:59:10.815653    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:10.816009    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:10.816026    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:10.816230    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:10.816335    3214 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:59:10.816458    3214 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:10.816470    3214 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:59:10.816546    3214 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:59:10.816619    3214 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:59:10.816698    3214 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:59:10.816779    3214 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:59:10.851687    3214 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:10.863191    3214 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:10.863205    3214 api_server.go:166] Checking apiserver status ...
	I0731 09:59:10.863241    3214 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:10.874906    3214 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup
	W0731 09:59:10.883198    3214 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:10.883262    3214 ssh_runner.go:195] Run: ls
	I0731 09:59:10.886767    3214 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:10.889810    3214 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:10.889822    3214 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 09:59:10.889830    3214 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:10.889841    3214 status.go:255] checking status of ha-393000-m03 ...
	I0731 09:59:10.890112    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:10.890132    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:10.898624    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51476
	I0731 09:59:10.898970    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:10.899355    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:10.899372    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:10.899607    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:10.899723    3214 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:59:10.899821    3214 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:10.899921    3214 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:59:10.900951    3214 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 09:59:10.900959    3214 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:10.901241    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:10.901268    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:10.910641    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51478
	I0731 09:59:10.910971    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:10.911326    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:10.911337    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:10.911561    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:10.911672    3214 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:59:10.911760    3214 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:10.912029    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:10.912058    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:10.920821    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51480
	I0731 09:59:10.921169    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:10.921569    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:10.921590    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:10.921831    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:10.921950    3214 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:59:10.922100    3214 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:10.922113    3214 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:59:10.922214    3214 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:59:10.922294    3214 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:59:10.922390    3214 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:59:10.922471    3214 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:59:10.956684    3214 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:10.968474    3214 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:10.968488    3214 api_server.go:166] Checking apiserver status ...
	I0731 09:59:10.968524    3214 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:10.981058    3214 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 09:59:10.989695    3214 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:10.989747    3214 ssh_runner.go:195] Run: ls
	I0731 09:59:10.992959    3214 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:10.996051    3214 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:10.996066    3214 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 09:59:10.996076    3214 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:10.996086    3214 status.go:255] checking status of ha-393000-m04 ...
	I0731 09:59:10.996400    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:10.996430    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:11.005149    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51484
	I0731 09:59:11.005485    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:11.005801    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:11.005811    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:11.006035    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:11.006157    3214 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:59:11.006249    3214 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:11.006332    3214 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:59:11.007353    3214 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 09:59:11.007365    3214 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:11.007635    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:11.007659    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:11.016461    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51486
	I0731 09:59:11.016807    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:11.017119    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:11.017129    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:11.017355    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:11.017465    3214 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:59:11.017556    3214 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:11.017824    3214 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:11.017856    3214 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:11.026463    3214 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51488
	I0731 09:59:11.026802    3214 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:11.027129    3214 main.go:141] libmachine: Using API Version  1
	I0731 09:59:11.027139    3214 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:11.027357    3214 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:11.027473    3214 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:59:11.027608    3214 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:11.027620    3214 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:59:11.027696    3214 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:59:11.027801    3214 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:59:11.027880    3214 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:59:11.027952    3214 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:59:11.061219    3214 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:11.073299    3214 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 2 (446.848278ms)

                                                
                                                
-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 09:59:12.461857    3228 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:59:12.462048    3228 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:12.462053    3228 out.go:304] Setting ErrFile to fd 2...
	I0731 09:59:12.462057    3228 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:12.462232    3228 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:59:12.462420    3228 out.go:298] Setting JSON to false
	I0731 09:59:12.462442    3228 mustload.go:65] Loading cluster: ha-393000
	I0731 09:59:12.462481    3228 notify.go:220] Checking for updates...
	I0731 09:59:12.462743    3228 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:59:12.462760    3228 status.go:255] checking status of ha-393000 ...
	I0731 09:59:12.463107    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.463167    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.471889    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51492
	I0731 09:59:12.472214    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.472680    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.472692    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.472905    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.473023    3228 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:59:12.473122    3228 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:12.473189    3228 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:59:12.474178    3228 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 09:59:12.474198    3228 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:12.474433    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.474462    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.482791    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51494
	I0731 09:59:12.483131    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.483449    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.483472    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.483748    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.483876    3228 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:59:12.483958    3228 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:12.484205    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.484238    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.493253    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51496
	I0731 09:59:12.493616    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.493934    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.493943    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.494173    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.494274    3228 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:59:12.494417    3228 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:12.494435    3228 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:59:12.494504    3228 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:59:12.494614    3228 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:59:12.494693    3228 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:59:12.494778    3228 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:59:12.530185    3228 ssh_runner.go:195] Run: systemctl --version
	I0731 09:59:12.534393    3228 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:12.546386    3228 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:12.546411    3228 api_server.go:166] Checking apiserver status ...
	I0731 09:59:12.546451    3228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:12.558085    3228 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:59:12.566171    3228 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:12.566219    3228 ssh_runner.go:195] Run: ls
	I0731 09:59:12.569317    3228 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:12.572382    3228 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:12.572394    3228 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 09:59:12.572404    3228 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:12.572422    3228 status.go:255] checking status of ha-393000-m02 ...
	I0731 09:59:12.572697    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.572721    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.581339    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51500
	I0731 09:59:12.581688    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.582023    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.582036    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.582250    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.582363    3228 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:59:12.582453    3228 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:12.582533    3228 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 09:59:12.583543    3228 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 09:59:12.583553    3228 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:12.583824    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.583859    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.593052    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51502
	I0731 09:59:12.593407    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.593782    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.593800    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.594006    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.594114    3228 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:59:12.594198    3228 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:12.594456    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.594480    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.602892    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51504
	I0731 09:59:12.603232    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.603554    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.603565    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.603749    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.603883    3228 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:59:12.604009    3228 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:12.604019    3228 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:59:12.604093    3228 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:59:12.604176    3228 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:59:12.604252    3228 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:59:12.604350    3228 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:59:12.638861    3228 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:12.650327    3228 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:12.650342    3228 api_server.go:166] Checking apiserver status ...
	I0731 09:59:12.650385    3228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:12.661830    3228 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup
	W0731 09:59:12.669945    3228 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:12.670002    3228 ssh_runner.go:195] Run: ls
	I0731 09:59:12.673442    3228 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:12.676602    3228 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:12.676614    3228 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 09:59:12.676622    3228 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:12.676632    3228 status.go:255] checking status of ha-393000-m03 ...
	I0731 09:59:12.676879    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.676899    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.685561    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51508
	I0731 09:59:12.685958    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.686295    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.686307    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.686527    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.686637    3228 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:59:12.686713    3228 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:12.686790    3228 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:59:12.687786    3228 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 09:59:12.687797    3228 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:12.688067    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.688092    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.696541    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51510
	I0731 09:59:12.696881    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.697200    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.697219    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.697411    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.697508    3228 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:59:12.697582    3228 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:12.697827    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.697851    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.706219    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51512
	I0731 09:59:12.706548    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.706864    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.706874    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.707082    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.707192    3228 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:59:12.707323    3228 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:12.707335    3228 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:59:12.707409    3228 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:59:12.707477    3228 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:59:12.707549    3228 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:59:12.707626    3228 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:59:12.742226    3228 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:12.753602    3228 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:12.753616    3228 api_server.go:166] Checking apiserver status ...
	I0731 09:59:12.753651    3228 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:12.765150    3228 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 09:59:12.773044    3228 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:12.773088    3228 ssh_runner.go:195] Run: ls
	I0731 09:59:12.776286    3228 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:12.779337    3228 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:12.779347    3228 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 09:59:12.779355    3228 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:12.779365    3228 status.go:255] checking status of ha-393000-m04 ...
	I0731 09:59:12.779630    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.779654    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.788332    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51516
	I0731 09:59:12.788682    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.788989    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.789000    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.789219    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.789333    3228 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:59:12.789418    3228 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:12.789494    3228 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:59:12.790515    3228 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 09:59:12.790533    3228 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:12.790782    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.790807    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.799179    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51518
	I0731 09:59:12.799524    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.799829    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.799842    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.800057    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.800162    3228 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:59:12.800257    3228 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:12.800508    3228 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:12.800529    3228 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:12.809067    3228 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51520
	I0731 09:59:12.809423    3228 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:12.809748    3228 main.go:141] libmachine: Using API Version  1
	I0731 09:59:12.809763    3228 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:12.809947    3228 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:12.810049    3228 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:59:12.810182    3228 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:12.810193    3228 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:59:12.810264    3228 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:59:12.810350    3228 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:59:12.810430    3228 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:59:12.810506    3228 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:59:12.842595    3228 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:12.852862    3228 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 2 (460.462023ms)

-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0731 09:59:14.872902    3242 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:59:14.873101    3242 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:14.873107    3242 out.go:304] Setting ErrFile to fd 2...
	I0731 09:59:14.873111    3242 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:14.873288    3242 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:59:14.873468    3242 out.go:298] Setting JSON to false
	I0731 09:59:14.873486    3242 mustload.go:65] Loading cluster: ha-393000
	I0731 09:59:14.873527    3242 notify.go:220] Checking for updates...
	I0731 09:59:14.873788    3242 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:59:14.873806    3242 status.go:255] checking status of ha-393000 ...
	I0731 09:59:14.874180    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:14.874240    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:14.882896    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51524
	I0731 09:59:14.883269    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:14.883697    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:14.883706    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:14.883947    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:14.884061    3242 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:59:14.884146    3242 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:14.884213    3242 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:59:14.885334    3242 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 09:59:14.885352    3242 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:14.885597    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:14.885618    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:14.894158    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51526
	I0731 09:59:14.894507    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:14.894857    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:14.894875    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:14.895076    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:14.895182    3242 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:59:14.895258    3242 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:14.895508    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:14.895535    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:14.903897    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51528
	I0731 09:59:14.904198    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:14.904539    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:14.904554    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:14.904746    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:14.904852    3242 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:59:14.904983    3242 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:14.905003    3242 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:59:14.905068    3242 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:59:14.905164    3242 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:59:14.905234    3242 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:59:14.905316    3242 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:59:14.941261    3242 ssh_runner.go:195] Run: systemctl --version
	I0731 09:59:14.945572    3242 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:14.959250    3242 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:14.959275    3242 api_server.go:166] Checking apiserver status ...
	I0731 09:59:14.959320    3242 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:14.971277    3242 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:59:14.980021    3242 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:14.980080    3242 ssh_runner.go:195] Run: ls
	I0731 09:59:14.983229    3242 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:14.987565    3242 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:14.987580    3242 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 09:59:14.987589    3242 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:14.987608    3242 status.go:255] checking status of ha-393000-m02 ...
	I0731 09:59:14.987870    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:14.987891    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:14.996517    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51532
	I0731 09:59:14.996853    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:14.997196    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:14.997207    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:14.997424    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:14.997550    3242 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:59:14.997634    3242 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:14.997704    3242 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 09:59:14.998757    3242 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 09:59:14.998765    3242 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:14.999020    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:14.999043    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:15.007591    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51534
	I0731 09:59:15.007923    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:15.008265    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:15.008277    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:15.008490    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:15.008603    3242 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:59:15.008701    3242 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:15.008962    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:15.008996    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:15.017723    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51536
	I0731 09:59:15.018078    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:15.018403    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:15.018414    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:15.018637    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:15.018747    3242 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:59:15.018891    3242 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:15.018904    3242 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:59:15.018981    3242 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:59:15.019067    3242 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:59:15.019155    3242 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:59:15.019227    3242 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:59:15.054476    3242 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:15.066237    3242 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:15.066252    3242 api_server.go:166] Checking apiserver status ...
	I0731 09:59:15.066288    3242 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:15.078278    3242 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup
	W0731 09:59:15.091741    3242 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:15.091798    3242 ssh_runner.go:195] Run: ls
	I0731 09:59:15.095102    3242 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:15.098156    3242 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:15.098170    3242 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 09:59:15.098179    3242 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:15.098189    3242 status.go:255] checking status of ha-393000-m03 ...
	I0731 09:59:15.098442    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:15.098464    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:15.107102    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51540
	I0731 09:59:15.107438    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:15.107816    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:15.107838    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:15.108042    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:15.108143    3242 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:59:15.108228    3242 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:15.108302    3242 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:59:15.109317    3242 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 09:59:15.109326    3242 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:15.109574    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:15.109603    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:15.117903    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51542
	I0731 09:59:15.118249    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:15.118596    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:15.118607    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:15.118802    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:15.118912    3242 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:59:15.118987    3242 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:15.119259    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:15.119283    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:15.127708    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51544
	I0731 09:59:15.128042    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:15.128347    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:15.128358    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:15.128564    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:15.128671    3242 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:59:15.128804    3242 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:15.128824    3242 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:59:15.128902    3242 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:59:15.128987    3242 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:59:15.129072    3242 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:59:15.129149    3242 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:59:15.163533    3242 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:15.177260    3242 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:15.177280    3242 api_server.go:166] Checking apiserver status ...
	I0731 09:59:15.177326    3242 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:15.189365    3242 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 09:59:15.197646    3242 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:15.197701    3242 ssh_runner.go:195] Run: ls
	I0731 09:59:15.200854    3242 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:15.203968    3242 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:15.203980    3242 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 09:59:15.203989    3242 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:15.204002    3242 status.go:255] checking status of ha-393000-m04 ...
	I0731 09:59:15.204250    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:15.204281    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:15.213348    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51548
	I0731 09:59:15.213720    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:15.214117    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:15.214134    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:15.214369    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:15.214473    3242 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:59:15.214563    3242 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:15.214653    3242 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:59:15.215689    3242 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 09:59:15.215698    3242 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:15.215966    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:15.215992    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:15.224571    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51550
	I0731 09:59:15.224908    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:15.225244    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:15.225262    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:15.225476    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:15.225584    3242 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:59:15.225670    3242 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:15.225928    3242 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:15.225950    3242 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:15.234424    3242 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51552
	I0731 09:59:15.234770    3242 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:15.235138    3242 main.go:141] libmachine: Using API Version  1
	I0731 09:59:15.235152    3242 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:15.235403    3242 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:15.235531    3242 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:59:15.235679    3242 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:15.235691    3242 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:59:15.235794    3242 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:59:15.235875    3242 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:59:15.235963    3242 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:59:15.236039    3242 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:59:15.268054    3242 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:15.278456    3242 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 2 (454.818782ms)

-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0731 09:59:18.100812    3256 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:59:18.100993    3256 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:18.100999    3256 out.go:304] Setting ErrFile to fd 2...
	I0731 09:59:18.101002    3256 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:18.101199    3256 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:59:18.101378    3256 out.go:298] Setting JSON to false
	I0731 09:59:18.101402    3256 mustload.go:65] Loading cluster: ha-393000
	I0731 09:59:18.101444    3256 notify.go:220] Checking for updates...
	I0731 09:59:18.101711    3256 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:59:18.101729    3256 status.go:255] checking status of ha-393000 ...
	I0731 09:59:18.102093    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.102136    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.111049    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51556
	I0731 09:59:18.111440    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.111875    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.111885    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.112078    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.112178    3256 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:59:18.112265    3256 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:18.112335    3256 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:59:18.113373    3256 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 09:59:18.113394    3256 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:18.113628    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.113647    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.122304    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51558
	I0731 09:59:18.122654    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.122980    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.122990    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.123181    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.123290    3256 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:59:18.123377    3256 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:18.123645    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.123676    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.133107    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51560
	I0731 09:59:18.133486    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.133842    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.133854    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.134077    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.134199    3256 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:59:18.134359    3256 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:18.134376    3256 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:59:18.134470    3256 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:59:18.134545    3256 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:59:18.134613    3256 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:59:18.134709    3256 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:59:18.172405    3256 ssh_runner.go:195] Run: systemctl --version
	I0731 09:59:18.177391    3256 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:18.189495    3256 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:18.189520    3256 api_server.go:166] Checking apiserver status ...
	I0731 09:59:18.189569    3256 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:18.201894    3256 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:59:18.210700    3256 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:18.210769    3256 ssh_runner.go:195] Run: ls
	I0731 09:59:18.213943    3256 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:18.218080    3256 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:18.218093    3256 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 09:59:18.218103    3256 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:18.218114    3256 status.go:255] checking status of ha-393000-m02 ...
	I0731 09:59:18.218369    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.218395    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.227089    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51564
	I0731 09:59:18.227446    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.227758    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.227770    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.227952    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.228058    3256 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:59:18.228149    3256 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:18.228221    3256 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 09:59:18.229280    3256 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 09:59:18.229292    3256 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:18.229557    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.229580    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.238169    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51566
	I0731 09:59:18.238519    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.238852    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.238870    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.239086    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.239216    3256 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:59:18.239318    3256 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:18.239583    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.239618    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.248145    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51568
	I0731 09:59:18.248487    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.248810    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.248820    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.249028    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.249155    3256 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:59:18.249297    3256 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:18.249309    3256 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:59:18.249391    3256 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:59:18.249474    3256 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:59:18.249564    3256 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:59:18.249650    3256 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:59:18.284301    3256 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:18.295951    3256 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:18.295965    3256 api_server.go:166] Checking apiserver status ...
	I0731 09:59:18.296003    3256 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:18.308427    3256 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup
	W0731 09:59:18.317112    3256 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:18.317164    3256 ssh_runner.go:195] Run: ls
	I0731 09:59:18.320414    3256 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:18.323498    3256 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:18.323509    3256 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 09:59:18.323518    3256 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:18.323536    3256 status.go:255] checking status of ha-393000-m03 ...
	I0731 09:59:18.323793    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.323812    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.332416    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51572
	I0731 09:59:18.332761    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.333116    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.333135    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.333373    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.333528    3256 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:59:18.333625    3256 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:18.333691    3256 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:59:18.334803    3256 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 09:59:18.334817    3256 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:18.335117    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.335149    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.343683    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51574
	I0731 09:59:18.344026    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.344374    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.344394    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.344618    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.344740    3256 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:59:18.344813    3256 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:18.345086    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.345109    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.353770    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51576
	I0731 09:59:18.354148    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.354503    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.354522    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.354739    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.354857    3256 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:59:18.355013    3256 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:18.355025    3256 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:59:18.355109    3256 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:59:18.355186    3256 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:59:18.355312    3256 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:59:18.355394    3256 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:59:18.389406    3256 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:18.400867    3256 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:18.400881    3256 api_server.go:166] Checking apiserver status ...
	I0731 09:59:18.400918    3256 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:18.412293    3256 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 09:59:18.420449    3256 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:18.420497    3256 ssh_runner.go:195] Run: ls
	I0731 09:59:18.423768    3256 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:18.426779    3256 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:18.426790    3256 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 09:59:18.426799    3256 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:18.426811    3256 status.go:255] checking status of ha-393000-m04 ...
	I0731 09:59:18.427061    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.427080    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.435684    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51580
	I0731 09:59:18.436029    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.436401    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.436415    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.436633    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.436745    3256 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:59:18.436824    3256 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:18.436903    3256 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:59:18.437966    3256 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 09:59:18.437976    3256 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:18.438235    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.438262    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.446776    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51582
	I0731 09:59:18.447111    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.447457    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.447471    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.447697    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.447804    3256 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:59:18.447888    3256 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:18.448153    3256 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:18.448175    3256 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:18.456565    3256 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51584
	I0731 09:59:18.456922    3256 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:18.457250    3256 main.go:141] libmachine: Using API Version  1
	I0731 09:59:18.457260    3256 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:18.457466    3256 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:18.457580    3256 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:59:18.457713    3256 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:18.457724    3256 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:59:18.457794    3256 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:59:18.457878    3256 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:59:18.457959    3256 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:59:18.458043    3256 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:59:18.489827    3256 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:18.499948    3256 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 2 (461.728514ms)

-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0731 09:59:20.306971    3270 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:59:20.307246    3270 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:20.307252    3270 out.go:304] Setting ErrFile to fd 2...
	I0731 09:59:20.307255    3270 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:20.307423    3270 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:59:20.307598    3270 out.go:298] Setting JSON to false
	I0731 09:59:20.307619    3270 mustload.go:65] Loading cluster: ha-393000
	I0731 09:59:20.307652    3270 notify.go:220] Checking for updates...
	I0731 09:59:20.307899    3270 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:59:20.307915    3270 status.go:255] checking status of ha-393000 ...
	I0731 09:59:20.308286    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.308334    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.317227    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51588
	I0731 09:59:20.317684    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.318094    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.318107    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.318349    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.318469    3270 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:59:20.318556    3270 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:20.318628    3270 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:59:20.319728    3270 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 09:59:20.319747    3270 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:20.319985    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.320006    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.328295    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51590
	I0731 09:59:20.328629    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.328997    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.329013    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.329218    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.329322    3270 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:59:20.329406    3270 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:20.329654    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.329681    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.338204    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51592
	I0731 09:59:20.338526    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.338852    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.338865    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.339083    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.339195    3270 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:59:20.339334    3270 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:20.339360    3270 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:59:20.339430    3270 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:59:20.339536    3270 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:59:20.339615    3270 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:59:20.339682    3270 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:59:20.376093    3270 ssh_runner.go:195] Run: systemctl --version
	I0731 09:59:20.380528    3270 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:20.392759    3270 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:20.392783    3270 api_server.go:166] Checking apiserver status ...
	I0731 09:59:20.392828    3270 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:20.404767    3270 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:59:20.413327    3270 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:20.413387    3270 ssh_runner.go:195] Run: ls
	I0731 09:59:20.416567    3270 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:20.419683    3270 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:20.419694    3270 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 09:59:20.419705    3270 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:20.419718    3270 status.go:255] checking status of ha-393000-m02 ...
	I0731 09:59:20.419961    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.419981    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.428825    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51596
	I0731 09:59:20.429167    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.429483    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.429492    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.429725    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.429842    3270 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:59:20.429927    3270 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:20.430000    3270 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 09:59:20.431083    3270 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 09:59:20.431091    3270 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:20.431340    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.431366    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.439920    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51598
	I0731 09:59:20.440240    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.440549    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.440558    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.440759    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.440871    3270 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:59:20.440966    3270 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:20.441228    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.441249    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.449634    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51600
	I0731 09:59:20.449984    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.450309    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.450318    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.450537    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.450650    3270 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:59:20.450793    3270 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:20.450805    3270 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:59:20.450882    3270 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:59:20.450971    3270 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:59:20.451059    3270 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:59:20.451138    3270 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:59:20.487393    3270 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:20.502855    3270 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:20.502872    3270 api_server.go:166] Checking apiserver status ...
	I0731 09:59:20.502913    3270 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:20.515189    3270 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup
	W0731 09:59:20.523765    3270 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:20.523813    3270 ssh_runner.go:195] Run: ls
	I0731 09:59:20.526999    3270 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:20.530136    3270 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:20.530147    3270 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 09:59:20.530155    3270 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:20.530165    3270 status.go:255] checking status of ha-393000-m03 ...
	I0731 09:59:20.530431    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.530451    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.539288    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51604
	I0731 09:59:20.539641    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.539974    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.539985    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.540185    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.540289    3270 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:59:20.540360    3270 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:20.540435    3270 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:59:20.541499    3270 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 09:59:20.541508    3270 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:20.541771    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.541807    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.550294    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51606
	I0731 09:59:20.550642    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.550980    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.550994    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.551209    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.551317    3270 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:59:20.551406    3270 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:20.551668    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.551692    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.560001    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51608
	I0731 09:59:20.560336    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.560688    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.560706    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.560908    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.561018    3270 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:59:20.561135    3270 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:20.561146    3270 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:59:20.561233    3270 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:59:20.561324    3270 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:59:20.561402    3270 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:59:20.561482    3270 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:59:20.595648    3270 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:20.607003    3270 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:20.607018    3270 api_server.go:166] Checking apiserver status ...
	I0731 09:59:20.607060    3270 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:20.621686    3270 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 09:59:20.629988    3270 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:20.630046    3270 ssh_runner.go:195] Run: ls
	I0731 09:59:20.633366    3270 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:20.636472    3270 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:20.636483    3270 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 09:59:20.636492    3270 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:20.636502    3270 status.go:255] checking status of ha-393000-m04 ...
	I0731 09:59:20.636753    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.636773    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.645274    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51612
	I0731 09:59:20.645610    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.645953    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.645965    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.646185    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.646302    3270 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:59:20.646391    3270 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:20.646463    3270 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:59:20.647516    3270 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 09:59:20.647523    3270 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:20.647773    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.647796    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.656255    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51614
	I0731 09:59:20.656609    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.656983    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.657000    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.657215    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.657322    3270 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:59:20.657403    3270 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:20.657661    3270 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:20.657682    3270 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:20.666453    3270 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51616
	I0731 09:59:20.666939    3270 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:20.667370    3270 main.go:141] libmachine: Using API Version  1
	I0731 09:59:20.667381    3270 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:20.667610    3270 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:20.667729    3270 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:59:20.667891    3270 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:20.667904    3270 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:59:20.668023    3270 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:59:20.668123    3270 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:59:20.668220    3270 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:59:20.668328    3270 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:59:20.700702    3270 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:20.711082    3270 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 2 (452.807244ms)

-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0731 09:59:24.645852    3284 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:59:24.646026    3284 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:24.646032    3284 out.go:304] Setting ErrFile to fd 2...
	I0731 09:59:24.646036    3284 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:24.646209    3284 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:59:24.646393    3284 out.go:298] Setting JSON to false
	I0731 09:59:24.646418    3284 mustload.go:65] Loading cluster: ha-393000
	I0731 09:59:24.646451    3284 notify.go:220] Checking for updates...
	I0731 09:59:24.646720    3284 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:59:24.646736    3284 status.go:255] checking status of ha-393000 ...
	I0731 09:59:24.647121    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.647156    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.655861    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51620
	I0731 09:59:24.656255    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.656683    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.656691    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.656888    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.656992    3284 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:59:24.657085    3284 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:24.657153    3284 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:59:24.658187    3284 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 09:59:24.658203    3284 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:24.658433    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.658454    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.666805    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51622
	I0731 09:59:24.667179    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.667504    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.667517    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.667711    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.667821    3284 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:59:24.667905    3284 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:24.668168    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.668190    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.677414    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51624
	I0731 09:59:24.677759    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.678108    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.678126    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.678328    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.678442    3284 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:59:24.678588    3284 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:24.678607    3284 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:59:24.678683    3284 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:59:24.678756    3284 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:59:24.678853    3284 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:59:24.678926    3284 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:59:24.713909    3284 ssh_runner.go:195] Run: systemctl --version
	I0731 09:59:24.718530    3284 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:24.731124    3284 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:24.731148    3284 api_server.go:166] Checking apiserver status ...
	I0731 09:59:24.731196    3284 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:24.743187    3284 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:59:24.751255    3284 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:24.751308    3284 ssh_runner.go:195] Run: ls
	I0731 09:59:24.754474    3284 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:24.757656    3284 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:24.757666    3284 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 09:59:24.757676    3284 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:24.757686    3284 status.go:255] checking status of ha-393000-m02 ...
	I0731 09:59:24.757935    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.757954    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.766711    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51628
	I0731 09:59:24.767055    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.767376    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.767386    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.767583    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.767700    3284 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:59:24.767785    3284 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:24.767851    3284 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 09:59:24.768927    3284 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 09:59:24.768935    3284 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:24.769177    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.769212    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.777880    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51630
	I0731 09:59:24.778230    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.778557    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.778568    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.778802    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.778919    3284 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:59:24.779008    3284 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:24.779269    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.779293    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.788145    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51632
	I0731 09:59:24.788509    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.788856    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.788871    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.789068    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.789177    3284 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:59:24.789310    3284 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:24.789322    3284 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:59:24.789403    3284 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:59:24.789481    3284 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:59:24.789574    3284 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:59:24.789651    3284 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:59:24.825855    3284 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:24.838039    3284 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:24.838053    3284 api_server.go:166] Checking apiserver status ...
	I0731 09:59:24.838097    3284 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:24.849956    3284 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup
	W0731 09:59:24.858238    3284 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:24.858292    3284 ssh_runner.go:195] Run: ls
	I0731 09:59:24.861597    3284 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:24.865683    3284 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:24.865696    3284 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 09:59:24.865705    3284 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:24.865716    3284 status.go:255] checking status of ha-393000-m03 ...
	I0731 09:59:24.865972    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.865993    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.874880    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51636
	I0731 09:59:24.875229    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.875546    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.875557    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.875768    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.875890    3284 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:59:24.875972    3284 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:24.876045    3284 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:59:24.877110    3284 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 09:59:24.877121    3284 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:24.877385    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.877408    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.885921    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51638
	I0731 09:59:24.886275    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.886610    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.886622    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.886842    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.886956    3284 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:59:24.887041    3284 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:24.887315    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.887340    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.895987    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51640
	I0731 09:59:24.896333    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.896655    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.896668    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.896893    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.896989    3284 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:59:24.897123    3284 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:24.897134    3284 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:59:24.897215    3284 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:59:24.897284    3284 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:59:24.897362    3284 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:59:24.897444    3284 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:59:24.931665    3284 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:24.943018    3284 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:24.943035    3284 api_server.go:166] Checking apiserver status ...
	I0731 09:59:24.943078    3284 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:24.954534    3284 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 09:59:24.962325    3284 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:24.962384    3284 ssh_runner.go:195] Run: ls
	I0731 09:59:24.965617    3284 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:24.968730    3284 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:24.968744    3284 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 09:59:24.968754    3284 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:24.968765    3284 status.go:255] checking status of ha-393000-m04 ...
	I0731 09:59:24.969030    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.969053    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.977989    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51644
	I0731 09:59:24.978340    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.978688    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.978704    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.978907    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.979004    3284 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:59:24.979083    3284 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:24.979175    3284 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:59:24.980280    3284 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 09:59:24.980296    3284 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:24.980581    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.980605    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.989509    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51646
	I0731 09:59:24.989908    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:24.990465    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:24.990479    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:24.990806    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:24.990937    3284 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:59:24.991040    3284 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:24.991321    3284 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:24.991352    3284 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:24.999812    3284 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51648
	I0731 09:59:25.000152    3284 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:25.000501    3284 main.go:141] libmachine: Using API Version  1
	I0731 09:59:25.000517    3284 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:25.000717    3284 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:25.000822    3284 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:59:25.000960    3284 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:25.000972    3284 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:59:25.001054    3284 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:59:25.001131    3284 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:59:25.001222    3284 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:59:25.001299    3284 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:59:25.033357    3284 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:25.043442    3284 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 2 (463.581868ms)

-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0731 09:59:34.662573    3300 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:59:34.662857    3300 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:34.662862    3300 out.go:304] Setting ErrFile to fd 2...
	I0731 09:59:34.662866    3300 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:34.663064    3300 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:59:34.663245    3300 out.go:298] Setting JSON to false
	I0731 09:59:34.663272    3300 mustload.go:65] Loading cluster: ha-393000
	I0731 09:59:34.663312    3300 notify.go:220] Checking for updates...
	I0731 09:59:34.663599    3300 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:59:34.663615    3300 status.go:255] checking status of ha-393000 ...
	I0731 09:59:34.664017    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:34.664060    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:34.672984    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51652
	I0731 09:59:34.673326    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:34.673765    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:34.673775    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:34.673983    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:34.674091    3300 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:59:34.674174    3300 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:34.674260    3300 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:59:34.675280    3300 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 09:59:34.675302    3300 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:34.675602    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:34.675624    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:34.684100    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51654
	I0731 09:59:34.684463    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:34.684793    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:34.684829    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:34.685074    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:34.685195    3300 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:59:34.685287    3300 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:34.685545    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:34.685576    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:34.694542    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51656
	I0731 09:59:34.694851    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:34.695164    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:34.695174    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:34.695417    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:34.695532    3300 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:59:34.695687    3300 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:34.695706    3300 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:59:34.695793    3300 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:59:34.695873    3300 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:59:34.695954    3300 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:59:34.696037    3300 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:59:34.731650    3300 ssh_runner.go:195] Run: systemctl --version
	I0731 09:59:34.735960    3300 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:34.750173    3300 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:34.750199    3300 api_server.go:166] Checking apiserver status ...
	I0731 09:59:34.750239    3300 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:34.763180    3300 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:59:34.773271    3300 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:34.773333    3300 ssh_runner.go:195] Run: ls
	I0731 09:59:34.777034    3300 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:34.780176    3300 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:34.780188    3300 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 09:59:34.780198    3300 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:34.780208    3300 status.go:255] checking status of ha-393000-m02 ...
	I0731 09:59:34.780458    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:34.780482    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:34.789119    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51660
	I0731 09:59:34.789467    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:34.789793    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:34.789804    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:34.790015    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:34.790117    3300 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:59:34.790190    3300 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:34.790267    3300 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 09:59:34.791260    3300 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 09:59:34.791271    3300 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:34.791519    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:34.791556    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:34.800296    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51662
	I0731 09:59:34.800663    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:34.800981    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:34.800991    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:34.801193    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:34.801293    3300 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:59:34.801380    3300 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:34.801633    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:34.801654    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:34.810201    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51664
	I0731 09:59:34.810605    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:34.810992    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:34.811011    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:34.811214    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:34.811309    3300 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:59:34.811439    3300 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:34.811451    3300 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:59:34.811532    3300 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:59:34.811612    3300 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:59:34.811689    3300 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:59:34.811758    3300 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:59:34.848853    3300 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:34.860999    3300 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:34.861014    3300 api_server.go:166] Checking apiserver status ...
	I0731 09:59:34.861052    3300 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:34.873318    3300 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup
	W0731 09:59:34.882110    3300 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:34.882162    3300 ssh_runner.go:195] Run: ls
	I0731 09:59:34.885295    3300 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:34.889482    3300 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:34.889497    3300 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 09:59:34.889506    3300 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:34.889517    3300 status.go:255] checking status of ha-393000-m03 ...
	I0731 09:59:34.889804    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:34.889825    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:34.898492    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51668
	I0731 09:59:34.898843    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:34.899141    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:34.899151    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:34.899384    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:34.899510    3300 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:59:34.899601    3300 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:34.899672    3300 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:59:34.900669    3300 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 09:59:34.900679    3300 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:34.900946    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:34.900978    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:34.909502    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51670
	I0731 09:59:34.909831    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:34.910210    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:34.910229    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:34.910427    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:34.910524    3300 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:59:34.910600    3300 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:34.910861    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:34.910883    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:34.919488    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51672
	I0731 09:59:34.919850    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:34.920204    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:34.920224    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:34.920438    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:34.920556    3300 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:59:34.920694    3300 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:34.920706    3300 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:59:34.920785    3300 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:59:34.920855    3300 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:59:34.920932    3300 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:59:34.921000    3300 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:59:34.955347    3300 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:34.967357    3300 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:34.967371    3300 api_server.go:166] Checking apiserver status ...
	I0731 09:59:34.967407    3300 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:34.979548    3300 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 09:59:34.987422    3300 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:34.987482    3300 ssh_runner.go:195] Run: ls
	I0731 09:59:34.990538    3300 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:34.993593    3300 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:34.993605    3300 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 09:59:34.993613    3300 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:34.993625    3300 status.go:255] checking status of ha-393000-m04 ...
	I0731 09:59:34.993878    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:34.993907    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:35.002495    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51676
	I0731 09:59:35.002854    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:35.003187    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:35.003201    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:35.003418    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:35.003542    3300 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:59:35.003628    3300 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:35.003713    3300 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:59:35.004704    3300 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 09:59:35.004714    3300 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:35.004985    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:35.005010    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:35.013621    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51678
	I0731 09:59:35.013962    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:35.014296    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:35.014307    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:35.014514    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:35.014636    3300 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:59:35.014727    3300 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:35.014967    3300 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:35.014997    3300 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:35.023473    3300 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51680
	I0731 09:59:35.023835    3300 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:35.024164    3300 main.go:141] libmachine: Using API Version  1
	I0731 09:59:35.024189    3300 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:35.024390    3300 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:35.024504    3300 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:59:35.024653    3300 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:35.024664    3300 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:59:35.024754    3300 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:59:35.024840    3300 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:59:35.024924    3300 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:59:35.025007    3300 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:59:35.057524    3300 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:35.068813    3300 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
E0731 09:59:38.352731    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 2 (448.813345ms)

                                                
                                                
-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 09:59:41.694169    3314 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:59:41.694435    3314 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:41.694440    3314 out.go:304] Setting ErrFile to fd 2...
	I0731 09:59:41.694444    3314 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:59:41.694618    3314 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:59:41.694817    3314 out.go:298] Setting JSON to false
	I0731 09:59:41.694840    3314 mustload.go:65] Loading cluster: ha-393000
	I0731 09:59:41.694873    3314 notify.go:220] Checking for updates...
	I0731 09:59:41.695159    3314 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:59:41.695175    3314 status.go:255] checking status of ha-393000 ...
	I0731 09:59:41.695522    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:41.695570    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:41.704462    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51684
	I0731 09:59:41.704875    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:41.705277    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:41.705294    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:41.705481    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:41.705581    3314 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:59:41.705670    3314 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:41.705743    3314 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:59:41.706737    3314 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 09:59:41.706757    3314 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:41.706993    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:41.707012    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:41.715425    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51686
	I0731 09:59:41.715767    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:41.716103    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:41.716114    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:41.716340    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:41.716458    3314 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:59:41.716541    3314 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:59:41.716796    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:41.716819    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:41.725319    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51688
	I0731 09:59:41.725637    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:41.725990    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:41.726007    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:41.726196    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:41.726308    3314 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:59:41.726451    3314 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:41.726469    3314 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:59:41.726540    3314 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:59:41.726614    3314 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:59:41.726695    3314 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:59:41.726778    3314 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:59:41.761996    3314 ssh_runner.go:195] Run: systemctl --version
	I0731 09:59:41.766517    3314 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:41.777460    3314 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:41.777486    3314 api_server.go:166] Checking apiserver status ...
	I0731 09:59:41.777527    3314 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:41.790535    3314 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 09:59:41.798957    3314 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:41.799017    3314 ssh_runner.go:195] Run: ls
	I0731 09:59:41.802101    3314 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:41.805300    3314 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:41.805311    3314 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 09:59:41.805321    3314 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:41.805332    3314 status.go:255] checking status of ha-393000-m02 ...
	I0731 09:59:41.805616    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:41.805636    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:41.814397    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51692
	I0731 09:59:41.814756    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:41.815076    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:41.815090    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:41.815323    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:41.815439    3314 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:59:41.815522    3314 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:41.815607    3314 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 09:59:41.816596    3314 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 09:59:41.816607    3314 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:41.816875    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:41.816898    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:41.825447    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51694
	I0731 09:59:41.825798    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:41.826118    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:41.826127    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:41.826353    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:41.826462    3314 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:59:41.826553    3314 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 09:59:41.826816    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:41.826843    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:41.835313    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51696
	I0731 09:59:41.835653    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:41.835968    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:41.836001    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:41.836224    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:41.836332    3314 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:59:41.836461    3314 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:41.836473    3314 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:59:41.836555    3314 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:59:41.836636    3314 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:59:41.836719    3314 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:59:41.836793    3314 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:59:41.873130    3314 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:41.884075    3314 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:41.884091    3314 api_server.go:166] Checking apiserver status ...
	I0731 09:59:41.884128    3314 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:41.896151    3314 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup
	W0731 09:59:41.903900    3314 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:41.903961    3314 ssh_runner.go:195] Run: ls
	I0731 09:59:41.907292    3314 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:41.910390    3314 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:41.910402    3314 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 09:59:41.910410    3314 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:41.910420    3314 status.go:255] checking status of ha-393000-m03 ...
	I0731 09:59:41.910674    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:41.910698    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:41.919288    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51700
	I0731 09:59:41.919640    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:41.919986    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:41.920000    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:41.920228    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:41.920351    3314 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:59:41.920453    3314 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:41.920522    3314 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:59:41.921550    3314 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 09:59:41.921561    3314 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:41.921807    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:41.921835    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:41.930280    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51702
	I0731 09:59:41.930625    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:41.930939    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:41.930951    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:41.931163    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:41.931272    3314 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:59:41.931359    3314 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 09:59:41.931616    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:41.931641    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:41.940132    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51704
	I0731 09:59:41.940482    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:41.940833    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:41.940853    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:41.941069    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:41.941174    3314 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:59:41.941307    3314 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:41.941326    3314 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:59:41.941412    3314 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:59:41.941523    3314 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:59:41.941602    3314 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:59:41.941672    3314 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:59:41.976970    3314 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:41.987633    3314 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 09:59:41.987647    3314 api_server.go:166] Checking apiserver status ...
	I0731 09:59:41.987690    3314 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:59:41.998999    3314 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 09:59:42.006689    3314 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 09:59:42.006741    3314 ssh_runner.go:195] Run: ls
	I0731 09:59:42.010106    3314 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 09:59:42.013146    3314 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 09:59:42.013156    3314 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 09:59:42.013165    3314 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 09:59:42.013175    3314 status.go:255] checking status of ha-393000-m04 ...
	I0731 09:59:42.013434    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:42.013454    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:42.022155    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51708
	I0731 09:59:42.022512    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:42.022804    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:42.022816    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:42.023024    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:42.023137    3314 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 09:59:42.023216    3314 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:59:42.023283    3314 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 09:59:42.024289    3314 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 09:59:42.024299    3314 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:42.024532    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:42.024564    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:42.033448    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51710
	I0731 09:59:42.033806    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:42.034152    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:42.034165    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:42.034367    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:42.034473    3314 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 09:59:42.034555    3314 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 09:59:42.034818    3314 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:59:42.034841    3314 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:59:42.043423    3314 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51712
	I0731 09:59:42.043770    3314 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:59:42.044126    3314 main.go:141] libmachine: Using API Version  1
	I0731 09:59:42.044144    3314 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:59:42.044342    3314 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:59:42.044455    3314 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 09:59:42.044578    3314 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 09:59:42.044589    3314 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 09:59:42.044660    3314 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 09:59:42.044733    3314 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 09:59:42.044810    3314 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 09:59:42.044892    3314 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 09:59:42.077046    3314 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:59:42.087623    3314 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 2 (452.761832ms)

-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0731 10:00:05.804933    3624 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:00:05.805222    3624 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:00:05.805227    3624 out.go:304] Setting ErrFile to fd 2...
	I0731 10:00:05.805231    3624 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:00:05.805424    3624 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:00:05.805612    3624 out.go:298] Setting JSON to false
	I0731 10:00:05.805635    3624 mustload.go:65] Loading cluster: ha-393000
	I0731 10:00:05.805674    3624 notify.go:220] Checking for updates...
	I0731 10:00:05.805973    3624 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:05.805989    3624 status.go:255] checking status of ha-393000 ...
	I0731 10:00:05.806337    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:05.806398    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:05.815506    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51718
	I0731 10:00:05.815833    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:05.816216    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:05.816226    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:05.816453    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:05.816594    3624 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:00:05.816680    3624 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:05.816760    3624 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 10:00:05.817748    3624 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 10:00:05.817772    3624 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:00:05.818019    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:05.818039    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:05.826551    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51720
	I0731 10:00:05.826885    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:05.827227    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:05.827244    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:05.827486    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:05.827603    3624 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:05.827684    3624 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:00:05.827932    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:05.827974    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:05.836388    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51722
	I0731 10:00:05.836703    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:05.837004    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:05.837020    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:05.837241    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:05.837344    3624 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:05.837501    3624 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:00:05.837518    3624 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:05.837595    3624 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:05.837694    3624 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:05.837769    3624 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:05.837855    3624 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:05.874389    3624 ssh_runner.go:195] Run: systemctl --version
	I0731 10:00:05.878642    3624 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:00:05.890703    3624 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 10:00:05.890727    3624 api_server.go:166] Checking apiserver status ...
	I0731 10:00:05.890763    3624 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:00:05.902919    3624 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup
	W0731 10:00:05.911027    3624 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2063/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:00:05.911077    3624 ssh_runner.go:195] Run: ls
	I0731 10:00:05.914285    3624 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 10:00:05.917297    3624 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 10:00:05.917308    3624 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 10:00:05.917317    3624 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:00:05.917328    3624 status.go:255] checking status of ha-393000-m02 ...
	I0731 10:00:05.917583    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:05.917610    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:05.926446    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51726
	I0731 10:00:05.926794    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:05.927123    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:05.927138    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:05.927355    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:05.927477    3624 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:00:05.927568    3624 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:05.927658    3624 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 10:00:05.928670    3624 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 10:00:05.928681    3624 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 10:00:05.928948    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:05.928974    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:05.937601    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51728
	I0731 10:00:05.937933    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:05.938250    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:05.938278    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:05.938514    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:05.938630    3624 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:00:05.938715    3624 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 10:00:05.938981    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:05.939005    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:05.947556    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51730
	I0731 10:00:05.947888    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:05.948200    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:05.948211    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:05.948409    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:05.948521    3624 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:00:05.948640    3624 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:00:05.948652    3624 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:00:05.948732    3624 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:00:05.948829    3624 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:00:05.948926    3624 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:00:05.949008    3624 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:00:05.984934    3624 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:00:05.995823    3624 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 10:00:05.995837    3624 api_server.go:166] Checking apiserver status ...
	I0731 10:00:05.995873    3624 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:00:06.007865    3624 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup
	W0731 10:00:06.015436    3624 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2031/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:00:06.015491    3624 ssh_runner.go:195] Run: ls
	I0731 10:00:06.018799    3624 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 10:00:06.021896    3624 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 10:00:06.021908    3624 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 10:00:06.021916    3624 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:00:06.021926    3624 status.go:255] checking status of ha-393000-m03 ...
	I0731 10:00:06.022205    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:06.022225    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:06.030753    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51734
	I0731 10:00:06.031103    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:06.031460    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:06.031475    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:06.031698    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:06.031805    3624 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 10:00:06.031882    3624 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:06.031975    3624 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 10:00:06.032958    3624 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 10:00:06.032968    3624 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 10:00:06.033253    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:06.033280    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:06.041988    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51736
	I0731 10:00:06.042325    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:06.042659    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:06.042669    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:06.042892    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:06.043010    3624 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:00:06.043093    3624 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 10:00:06.043358    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:06.043382    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:06.052110    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51738
	I0731 10:00:06.052512    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:06.052875    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:06.052899    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:06.053103    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:06.053232    3624 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:00:06.053370    3624 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:00:06.053392    3624 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:00:06.053478    3624 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:00:06.053565    3624 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:00:06.053651    3624 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:00:06.053741    3624 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:00:06.087745    3624 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:00:06.099417    3624 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 10:00:06.099433    3624 api_server.go:166] Checking apiserver status ...
	I0731 10:00:06.099475    3624 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:00:06.111402    3624 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup
	W0731 10:00:06.119425    3624 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2048/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:00:06.119480    3624 ssh_runner.go:195] Run: ls
	I0731 10:00:06.122575    3624 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 10:00:06.125638    3624 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 10:00:06.125653    3624 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 10:00:06.125661    3624 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:00:06.125673    3624 status.go:255] checking status of ha-393000-m04 ...
	I0731 10:00:06.125938    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:06.125958    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:06.134506    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51742
	I0731 10:00:06.134875    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:06.135200    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:06.135209    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:06.135409    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:06.135523    3624 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 10:00:06.135609    3624 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:06.135704    3624 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 10:00:06.136709    3624 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 10:00:06.136720    3624 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 10:00:06.136986    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:06.137011    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:06.145610    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51744
	I0731 10:00:06.145945    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:06.146321    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:06.146338    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:06.146550    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:06.146660    3624 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:00:06.146755    3624 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 10:00:06.146998    3624 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:06.147024    3624 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:06.155518    3624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51746
	I0731 10:00:06.155855    3624 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:06.156190    3624 main.go:141] libmachine: Using API Version  1
	I0731 10:00:06.156200    3624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:06.156415    3624 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:06.156556    3624 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:00:06.156689    3624 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:00:06.156700    3624 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:00:06.156777    3624 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:00:06.156857    3624 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:00:06.156931    3624 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:00:06.157009    3624 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:00:06.190659    3624 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:00:06.201922    3624 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:432: failed to run minikube status. args "out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr" : exit status 2
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:244: <<< TestMultiControlPlane/serial/RestartSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/RestartSecondaryNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (2.115747003s)
helpers_test.go:252: TestMultiControlPlane/serial/RestartSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |      Profile      |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| delete  | -p functional-680000                 | functional-680000 | jenkins | v1.33.1 | 31 Jul 24 09:53 PDT | 31 Jul 24 09:53 PDT |
	| start   | -p ha-393000 --wait=true             | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:53 PDT | 31 Jul 24 09:56 PDT |
	|         | --memory=2200 --ha                   |                   |         |         |                     |                     |
	|         | -v=7 --alsologtostderr               |                   |         |         |                     |                     |
	|         | --driver=hyperkit                    |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- apply -f             | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | ./testdata/ha/ha-pod-dns-test.yaml   |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- rollout status       | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | deployment/busybox                   |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |                   |         |         |                     |                     |
	|         | -- sh -c nslookup                    |                   |         |         |                     |                     |
	|         | host.minikube.internal | awk         |                   |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |                   |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |                   |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |                   |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |                   |         |         |                     |                     |
	| node    | ha-393000 node stop m02 -v=7         | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:58 PDT |
	|         | --alsologtostderr                    |                   |         |         |                     |                     |
	| node    | ha-393000 node start m02 -v=7        | ha-393000         | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:59 PDT |
	|         | --alsologtostderr                    |                   |         |         |                     |                     |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 09:53:16
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 09:53:16.140722    2954 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:53:16.140891    2954 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:53:16.140897    2954 out.go:304] Setting ErrFile to fd 2...
	I0731 09:53:16.140901    2954 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:53:16.141085    2954 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:53:16.142669    2954 out.go:298] Setting JSON to false
	I0731 09:53:16.166361    2954 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1366,"bootTime":1722443430,"procs":467,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 09:53:16.166460    2954 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 09:53:16.192371    2954 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 09:53:16.233499    2954 notify.go:220] Checking for updates...
	I0731 09:53:16.263444    2954 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 09:53:16.328756    2954 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:53:16.398694    2954 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 09:53:16.420465    2954 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 09:53:16.443406    2954 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:53:16.464565    2954 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 09:53:16.486871    2954 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 09:53:16.517461    2954 out.go:177] * Using the hyperkit driver based on user configuration
	I0731 09:53:16.559490    2954 start.go:297] selected driver: hyperkit
	I0731 09:53:16.559519    2954 start.go:901] validating driver "hyperkit" against <nil>
	I0731 09:53:16.559538    2954 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 09:53:16.563960    2954 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:53:16.564071    2954 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 09:53:16.572413    2954 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 09:53:16.576399    2954 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:53:16.576420    2954 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 09:53:16.576454    2954 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 09:53:16.576646    2954 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:53:16.576708    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:16.576719    2954 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0731 09:53:16.576725    2954 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0731 09:53:16.576791    2954 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:53:16.576877    2954 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:53:16.619419    2954 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 09:53:16.640390    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:53:16.640480    2954 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 09:53:16.640509    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:53:16.640712    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:53:16.640731    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:53:16.641227    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:53:16.641275    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json: {Name:mka52f595799559e261228b691f11b60413ee780 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:16.641876    2954 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:53:16.641986    2954 start.go:364] duration metric: took 90.888µs to acquireMachinesLock for "ha-393000"
	I0731 09:53:16.642025    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:53:16.642108    2954 start.go:125] createHost starting for "" (driver="hyperkit")
	I0731 09:53:16.663233    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:53:16.663389    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:53:16.663426    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:53:16.672199    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51037
	I0731 09:53:16.672559    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:53:16.672976    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:53:16.672987    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:53:16.673241    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:53:16.673369    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:16.673473    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:16.673584    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:53:16.673605    2954 client.go:168] LocalClient.Create starting
	I0731 09:53:16.673642    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:53:16.673693    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:53:16.673710    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:53:16.673763    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:53:16.673801    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:53:16.673815    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:53:16.673840    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:53:16.673850    2954 main.go:141] libmachine: (ha-393000) Calling .PreCreateCheck
	I0731 09:53:16.673929    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:16.674073    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:16.684622    2954 main.go:141] libmachine: Creating machine...
	I0731 09:53:16.684647    2954 main.go:141] libmachine: (ha-393000) Calling .Create
	I0731 09:53:16.684806    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:16.685170    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.684943    2962 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:53:16.685305    2954 main.go:141] libmachine: (ha-393000) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:53:16.866642    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.866533    2962 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa...
	I0731 09:53:16.907777    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.907707    2962 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk...
	I0731 09:53:16.907795    2954 main.go:141] libmachine: (ha-393000) DBG | Writing magic tar header
	I0731 09:53:16.907815    2954 main.go:141] libmachine: (ha-393000) DBG | Writing SSH key tar header
	I0731 09:53:16.908296    2954 main.go:141] libmachine: (ha-393000) DBG | I0731 09:53:16.908249    2962 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000 ...
	I0731 09:53:17.278530    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:17.278549    2954 main.go:141] libmachine: (ha-393000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 09:53:17.278657    2954 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 09:53:17.388690    2954 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 09:53:17.388709    2954 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:53:17.388758    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:53:17.388793    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d0240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:53:17.388830    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:53:17.388871    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:53:17.388884    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:53:17.391787    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 DEBUG: hyperkit: Pid is 2965
	I0731 09:53:17.392177    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 09:53:17.392188    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:17.392264    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:17.393257    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:17.393317    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:17.393342    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:17.393359    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:17.393369    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:17.399449    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:53:17.451566    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:53:17.452146    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:53:17.452168    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:53:17.452176    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:53:17.452184    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:53:17.832667    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:53:17.832680    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:53:17.947165    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:53:17.947181    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:53:17.947203    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:53:17.947214    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:53:17.948083    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:53:17.948094    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:17 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:53:19.393474    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 1
	I0731 09:53:19.393491    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:19.393544    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:19.394408    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:19.394431    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:19.394439    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:19.394449    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:19.394461    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:21.396273    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 2
	I0731 09:53:21.396290    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:21.396404    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:21.397210    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:21.397262    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:21.397275    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:21.397283    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:21.397292    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:23.397619    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 3
	I0731 09:53:23.397635    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:23.397733    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:23.398576    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:23.398585    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:23.398595    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:23.398604    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:23.398623    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:23.511265    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 09:53:23.511317    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 09:53:23.511327    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 09:53:23.534471    2954 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 09:53:23 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 09:53:25.399722    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 4
	I0731 09:53:25.399735    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:25.399799    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:25.400596    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:25.400655    2954 main.go:141] libmachine: (ha-393000) DBG | Found 3 entries in /var/db/dhcpd_leases!
	I0731 09:53:25.400665    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:53:25.400672    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:53:25.400681    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:53:27.400848    2954 main.go:141] libmachine: (ha-393000) DBG | Attempt 5
	I0731 09:53:27.400872    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:27.400976    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:27.401778    2954 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 09:53:27.401824    2954 main.go:141] libmachine: (ha-393000) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:53:27.401836    2954 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:53:27.401845    2954 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 09:53:27.401856    2954 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 09:53:27.401921    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:27.402530    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:27.402623    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:27.402706    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:53:27.402714    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:53:27.402795    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:53:27.402846    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:53:27.403621    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:53:27.403635    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:53:27.403641    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:53:27.403647    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:27.403727    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:27.403804    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:27.403889    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:27.403968    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:27.404083    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:27.404258    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:27.404265    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:53:28.471124    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:53:28.471139    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:53:28.471151    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.471303    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.471413    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.471516    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.471604    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.471751    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.471894    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.471902    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:53:28.534700    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:53:28.534755    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:53:28.534761    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:53:28.534766    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.534914    2954 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 09:53:28.534924    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.535023    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.535122    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.535205    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.535305    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.535404    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.535525    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.535678    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.535686    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 09:53:28.612223    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 09:53:28.612243    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.612383    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.612495    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.612585    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.612692    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.612835    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:28.612989    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:28.613000    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:53:28.684692    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:53:28.684711    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:53:28.684731    2954 buildroot.go:174] setting up certificates
	I0731 09:53:28.684742    2954 provision.go:84] configureAuth start
	I0731 09:53:28.684753    2954 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 09:53:28.684892    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:28.684986    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.685097    2954 provision.go:143] copyHostCerts
	I0731 09:53:28.685132    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:53:28.685202    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:53:28.685210    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:53:28.685348    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:53:28.685544    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:53:28.685575    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:53:28.685580    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:53:28.685671    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:53:28.685817    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:53:28.685858    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:53:28.685863    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:53:28.685947    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:53:28.686099    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 09:53:28.975770    2954 provision.go:177] copyRemoteCerts
	I0731 09:53:28.975860    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:53:28.975879    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:28.976044    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:28.976151    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:28.976253    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:28.976368    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:29.014295    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:53:29.014364    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0731 09:53:29.033836    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:53:29.033901    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 09:53:29.053674    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:53:29.053744    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:53:29.073245    2954 provision.go:87] duration metric: took 388.494938ms to configureAuth
	I0731 09:53:29.073258    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:53:29.073388    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:53:29.073402    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:29.073538    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.073618    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.073712    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.073794    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.073871    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.073977    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.074114    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.074121    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:53:29.138646    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:53:29.138660    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:53:29.138727    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:53:29.138739    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.138887    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.138979    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.139070    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.139173    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.139333    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.139499    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.139544    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:53:29.214149    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:53:29.214180    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:29.214320    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:29.214403    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.214495    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:29.214599    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:29.214718    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:29.214856    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:29.214868    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:53:30.823417    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 09:53:30.823433    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:53:30.823439    2954 main.go:141] libmachine: (ha-393000) Calling .GetURL
	I0731 09:53:30.823574    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:53:30.823582    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:53:30.823587    2954 client.go:171] duration metric: took 14.150104113s to LocalClient.Create
	I0731 09:53:30.823598    2954 start.go:167] duration metric: took 14.150148374s to libmachine.API.Create "ha-393000"
	I0731 09:53:30.823607    2954 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 09:53:30.823621    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:53:30.823633    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.823781    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:53:30.823793    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.823880    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.823974    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.824065    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.824160    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:30.868545    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:53:30.872572    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:53:30.872587    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:53:30.872696    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:53:30.872889    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:53:30.872896    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:53:30.873123    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:53:30.890087    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:53:30.911977    2954 start.go:296] duration metric: took 88.361428ms for postStartSetup
	I0731 09:53:30.912003    2954 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 09:53:30.912600    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:30.912759    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:53:30.913103    2954 start.go:128] duration metric: took 14.271109881s to createHost
	I0731 09:53:30.913117    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.913201    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.913305    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.913399    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.913473    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.913588    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:53:30.913703    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 09:53:30.913711    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:53:30.978737    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444810.120322538
	
	I0731 09:53:30.978750    2954 fix.go:216] guest clock: 1722444810.120322538
	I0731 09:53:30.978755    2954 fix.go:229] Guest: 2024-07-31 09:53:30.120322538 -0700 PDT Remote: 2024-07-31 09:53:30.913111 -0700 PDT m=+14.813015151 (delta=-792.788462ms)
	I0731 09:53:30.978778    2954 fix.go:200] guest clock delta is within tolerance: -792.788462ms
	I0731 09:53:30.978783    2954 start.go:83] releasing machines lock for "ha-393000", held for 14.336915594s
	I0731 09:53:30.978805    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.978937    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:30.979046    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979390    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979496    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:53:30.979591    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:53:30.979625    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.979645    2954 ssh_runner.go:195] Run: cat /version.json
	I0731 09:53:30.979655    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:53:30.979750    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.979786    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:53:30.979846    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.979902    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:53:30.979927    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.979985    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:53:30.980003    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:30.980063    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:53:31.061693    2954 ssh_runner.go:195] Run: systemctl --version
	I0731 09:53:31.066472    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 09:53:31.070647    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:53:31.070687    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:53:31.084420    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:53:31.084432    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:53:31.084539    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:53:31.099368    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:53:31.108753    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:53:31.117896    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:53:31.117944    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:53:31.126974    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:53:31.135823    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:53:31.144673    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:53:31.153676    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:53:31.162890    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:53:31.171995    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:53:31.181357    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:53:31.190300    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:53:31.198317    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:53:31.206286    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:31.306658    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:53:31.325552    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:53:31.325643    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:53:31.346571    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:53:31.359753    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:53:31.393299    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:53:31.404448    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:53:31.414860    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:53:31.437636    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:53:31.448198    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:53:31.464071    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:53:31.467113    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:53:31.474646    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:53:31.488912    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:53:31.589512    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:53:31.693775    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:53:31.693845    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:53:31.709549    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:31.811094    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:53:34.149023    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.337932224s)
	I0731 09:53:34.149088    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:53:34.161198    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:53:34.175766    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:53:34.187797    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:53:34.283151    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:53:34.377189    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:34.469067    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:53:34.482248    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:53:34.492385    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:34.587912    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:53:34.647834    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:53:34.647904    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:53:34.652204    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:53:34.652250    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:53:34.655108    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:53:34.680326    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:53:34.680403    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:53:34.699387    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:53:34.764313    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:53:34.764369    2954 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 09:53:34.764763    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:53:34.769523    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:53:34.780319    2954 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 Moun
tType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 09:53:34.780379    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:53:34.780438    2954 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 09:53:34.792271    2954 docker.go:685] Got preloaded images: 
	I0731 09:53:34.792283    2954 docker.go:691] registry.k8s.io/kube-apiserver:v1.30.3 wasn't preloaded
	I0731 09:53:34.792332    2954 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0731 09:53:34.800298    2954 ssh_runner.go:195] Run: which lz4
	I0731 09:53:34.803039    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0731 09:53:34.803157    2954 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0731 09:53:34.806121    2954 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0731 09:53:34.806135    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (359612007 bytes)
	I0731 09:53:35.858525    2954 docker.go:649] duration metric: took 1.055419334s to copy over tarball
	I0731 09:53:35.858591    2954 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0731 09:53:38.196952    2954 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.338365795s)
	I0731 09:53:38.196967    2954 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0731 09:53:38.223533    2954 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0731 09:53:38.232307    2954 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2630 bytes)
	I0731 09:53:38.245888    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:38.355987    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:53:40.705059    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.349073816s)
	I0731 09:53:40.705149    2954 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 09:53:40.718481    2954 docker.go:685] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0731 09:53:40.718506    2954 cache_images.go:84] Images are preloaded, skipping loading
	I0731 09:53:40.718529    2954 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 09:53:40.718621    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:53:40.718689    2954 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 09:53:40.756905    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:40.756918    2954 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0731 09:53:40.756931    2954 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 09:53:40.756946    2954 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 09:53:40.757028    2954 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 09:53:40.757045    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:53:40.757094    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:53:40.770142    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:53:40.770212    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:53:40.770264    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:53:40.778467    2954 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 09:53:40.778510    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 09:53:40.786404    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 09:53:40.799629    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:53:40.814270    2954 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 09:53:40.827819    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1446 bytes)
	I0731 09:53:40.841352    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:53:40.844280    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:53:40.854288    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:53:40.961875    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:53:40.976988    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 09:53:40.977000    2954 certs.go:194] generating shared ca certs ...
	I0731 09:53:40.977011    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:40.977205    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:53:40.977278    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:53:40.977287    2954 certs.go:256] generating profile certs ...
	I0731 09:53:40.977331    2954 certs.go:363] generating signed profile cert for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:53:40.977344    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt with IP's: []
	I0731 09:53:41.064733    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt ...
	I0731 09:53:41.064749    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt: {Name:mk11f8b5ec16b878c9f692ccaff9a489ecc76fb2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.065074    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key ...
	I0731 09:53:41.065082    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key: {Name:mk18e6554cf3c807804faf77a7a9620e92860212 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.065322    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9
	I0731 09:53:41.065337    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.254]
	I0731 09:53:41.267360    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 ...
	I0731 09:53:41.267375    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9: {Name:mk9c13a9d071c94395118e1f00f992954683ef5b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.267745    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9 ...
	I0731 09:53:41.267755    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9: {Name:mk49f9f4ab2c1350a3cdb49ded7d6cffd5f069e5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.267965    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.144083e9 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:53:41.268145    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.144083e9 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:53:41.268307    2954 certs.go:363] generating signed profile cert for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:53:41.268320    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt with IP's: []
	I0731 09:53:41.352486    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt ...
	I0731 09:53:41.352499    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt: {Name:mk6759a3c690d7a9e990f65c338d22538c5b127a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.352775    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key ...
	I0731 09:53:41.352788    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key: {Name:mk4f661b46725a943b9862deb5f02f250855a1b5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:53:41.352992    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:53:41.353021    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:53:41.353040    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:53:41.353059    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:53:41.353078    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:53:41.353096    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:53:41.353115    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:53:41.353132    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:53:41.353229    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:53:41.353280    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:53:41.353289    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:53:41.353319    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:53:41.353348    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:53:41.353377    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:53:41.353444    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:53:41.353475    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.353494    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.353511    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.353950    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:53:41.373611    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:53:41.392573    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:53:41.412520    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:53:41.433349    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0731 09:53:41.452365    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0731 09:53:41.472032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:53:41.491092    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:53:41.510282    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:53:41.529242    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:53:41.549127    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:53:41.568112    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 09:53:41.581548    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:53:41.585729    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:53:41.594979    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.598924    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.598977    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:53:41.603300    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:53:41.612561    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:53:41.621665    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.624970    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.625005    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:53:41.629117    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 09:53:41.638283    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:53:41.647422    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.650741    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.650776    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:53:41.654995    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:53:41.664976    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:53:41.668030    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:53:41.668072    2954 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 C
lusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountTy
pe:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:53:41.668156    2954 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 09:53:41.680752    2954 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 09:53:41.691788    2954 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0731 09:53:41.701427    2954 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0731 09:53:41.710462    2954 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0731 09:53:41.710473    2954 kubeadm.go:157] found existing configuration files:
	
	I0731 09:53:41.710522    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0731 09:53:41.718051    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0731 09:53:41.718109    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0731 09:53:41.726696    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0731 09:53:41.737698    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0731 09:53:41.737751    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0731 09:53:41.745907    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0731 09:53:41.753641    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0731 09:53:41.753680    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0731 09:53:41.761450    2954 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0731 09:53:41.769156    2954 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0731 09:53:41.769207    2954 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0731 09:53:41.777068    2954 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0731 09:53:41.848511    2954 kubeadm.go:310] [init] Using Kubernetes version: v1.30.3
	I0731 09:53:41.848564    2954 kubeadm.go:310] [preflight] Running pre-flight checks
	I0731 09:53:41.937481    2954 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0731 09:53:41.937568    2954 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0731 09:53:41.937658    2954 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0731 09:53:42.093209    2954 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0731 09:53:42.137661    2954 out.go:204]   - Generating certificates and keys ...
	I0731 09:53:42.137715    2954 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0731 09:53:42.137758    2954 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0731 09:53:42.784132    2954 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0731 09:53:42.954915    2954 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0731 09:53:43.064099    2954 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0731 09:53:43.107145    2954 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0731 09:53:43.256550    2954 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0731 09:53:43.256643    2954 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-393000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0731 09:53:43.365808    2954 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0731 09:53:43.365910    2954 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-393000 localhost] and IPs [192.169.0.5 127.0.0.1 ::1]
	I0731 09:53:43.496987    2954 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0731 09:53:43.811530    2954 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0731 09:53:43.998883    2954 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0731 09:53:43.999156    2954 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0731 09:53:44.246352    2954 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0731 09:53:44.460463    2954 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0731 09:53:44.552236    2954 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0731 09:53:44.656335    2954 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0731 09:53:44.920852    2954 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0731 09:53:44.921188    2954 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0731 09:53:44.922677    2954 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0731 09:53:44.944393    2954 out.go:204]   - Booting up control plane ...
	I0731 09:53:44.944462    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0731 09:53:44.944530    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0731 09:53:44.944583    2954 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0731 09:53:44.944663    2954 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0731 09:53:44.944728    2954 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0731 09:53:44.944759    2954 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0731 09:53:45.048317    2954 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0731 09:53:45.048393    2954 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0731 09:53:45.548165    2954 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 500.802272ms
	I0731 09:53:45.548224    2954 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0731 09:53:51.610602    2954 kubeadm.go:310] [api-check] The API server is healthy after 6.066816222s
	I0731 09:53:51.618854    2954 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0731 09:53:51.625868    2954 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0731 09:53:51.637830    2954 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0731 09:53:51.637998    2954 kubeadm.go:310] [mark-control-plane] Marking the node ha-393000 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0731 09:53:51.650953    2954 kubeadm.go:310] [bootstrap-token] Using token: wt4o9v.66pnb4w7anxpqs79
	I0731 09:53:51.687406    2954 out.go:204]   - Configuring RBAC rules ...
	I0731 09:53:51.687587    2954 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0731 09:53:51.690002    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0731 09:53:51.716618    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0731 09:53:51.718333    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0731 09:53:51.720211    2954 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0731 09:53:51.722003    2954 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0731 09:53:52.016537    2954 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0731 09:53:52.431449    2954 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0731 09:53:53.015675    2954 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0731 09:53:53.016431    2954 kubeadm.go:310] 
	I0731 09:53:53.016524    2954 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0731 09:53:53.016539    2954 kubeadm.go:310] 
	I0731 09:53:53.016612    2954 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0731 09:53:53.016623    2954 kubeadm.go:310] 
	I0731 09:53:53.016649    2954 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0731 09:53:53.016721    2954 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0731 09:53:53.016763    2954 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0731 09:53:53.016773    2954 kubeadm.go:310] 
	I0731 09:53:53.016814    2954 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0731 09:53:53.016821    2954 kubeadm.go:310] 
	I0731 09:53:53.016868    2954 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0731 09:53:53.016891    2954 kubeadm.go:310] 
	I0731 09:53:53.016935    2954 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0731 09:53:53.017005    2954 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0731 09:53:53.017059    2954 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0731 09:53:53.017072    2954 kubeadm.go:310] 
	I0731 09:53:53.017139    2954 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0731 09:53:53.017203    2954 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0731 09:53:53.017207    2954 kubeadm.go:310] 
	I0731 09:53:53.017269    2954 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token wt4o9v.66pnb4w7anxpqs79 \
	I0731 09:53:53.017353    2954 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 \
	I0731 09:53:53.017373    2954 kubeadm.go:310] 	--control-plane 
	I0731 09:53:53.017381    2954 kubeadm.go:310] 
	I0731 09:53:53.017452    2954 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0731 09:53:53.017461    2954 kubeadm.go:310] 
	I0731 09:53:53.017528    2954 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token wt4o9v.66pnb4w7anxpqs79 \
	I0731 09:53:53.017610    2954 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 
	I0731 09:53:53.018224    2954 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0731 09:53:53.018239    2954 cni.go:84] Creating CNI manager for ""
	I0731 09:53:53.018245    2954 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0731 09:53:53.040097    2954 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0731 09:53:53.097376    2954 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0731 09:53:53.101992    2954 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.3/kubectl ...
	I0731 09:53:53.102004    2954 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0731 09:53:53.115926    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0731 09:53:53.335699    2954 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0731 09:53:53.335768    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:53.335769    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000 minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=true
	I0731 09:53:53.489955    2954 ops.go:34] apiserver oom_adj: -16
	I0731 09:53:53.490022    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:53.990085    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:54.490335    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:54.991422    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:55.490608    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:55.990200    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:56.490175    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:56.990807    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:57.491373    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:57.991164    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:58.491587    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:58.990197    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:59.490119    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:53:59.990444    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:00.490776    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:00.990123    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:01.490685    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:01.991905    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:02.490505    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:02.990148    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:03.490590    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:03.990745    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:04.491071    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:04.991117    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:05.490027    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0731 09:54:05.576301    2954 kubeadm.go:1113] duration metric: took 12.240698872s to wait for elevateKubeSystemPrivileges
	I0731 09:54:05.576324    2954 kubeadm.go:394] duration metric: took 23.908471214s to StartCluster
	I0731 09:54:05.576346    2954 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:05.576441    2954 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:54:05.576993    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:05.577274    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0731 09:54:05.577286    2954 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:05.577302    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:54:05.577319    2954 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 09:54:05.577357    2954 addons.go:69] Setting storage-provisioner=true in profile "ha-393000"
	I0731 09:54:05.577363    2954 addons.go:69] Setting default-storageclass=true in profile "ha-393000"
	I0731 09:54:05.577386    2954 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-393000"
	I0731 09:54:05.577386    2954 addons.go:234] Setting addon storage-provisioner=true in "ha-393000"
	I0731 09:54:05.577408    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:05.577423    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:05.577661    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.577669    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.577675    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.577679    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.587150    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51060
	I0731 09:54:05.587233    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51061
	I0731 09:54:05.587573    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.587584    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.587918    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.587919    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.587930    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.587931    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.588210    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.588232    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.588358    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.588454    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.588531    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.588614    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.588639    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.590714    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:54:05.590994    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 09:54:05.591385    2954 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 09:54:05.591537    2954 addons.go:234] Setting addon default-storageclass=true in "ha-393000"
	I0731 09:54:05.591560    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:05.591783    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.591798    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.597469    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51064
	I0731 09:54:05.597830    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.598161    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.598171    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.598405    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.598520    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.598612    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.598688    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.599681    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:05.600339    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51066
	I0731 09:54:05.600677    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.601035    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.601051    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.601254    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.601611    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:05.601637    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:05.610207    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51068
	I0731 09:54:05.610548    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:05.610892    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:05.610909    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:05.611149    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:05.611266    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:05.611351    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:05.611421    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:05.612421    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:05.612552    2954 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0731 09:54:05.612560    2954 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0731 09:54:05.612568    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:05.612695    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:05.612786    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:05.612891    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:05.612974    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:05.623428    2954 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0731 09:54:05.644440    2954 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0731 09:54:05.644452    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0731 09:54:05.644468    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:05.644630    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:05.644723    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:05.644822    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:05.644921    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:05.653382    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.169.0.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0731 09:54:05.687318    2954 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.3/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0731 09:54:05.764200    2954 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.3/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0731 09:54:06.182319    2954 start.go:971] {"host.minikube.internal": 192.169.0.1} host record injected into CoreDNS's ConfigMap
	I0731 09:54:06.182364    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.182377    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.182560    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.182561    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.182572    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.182582    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.182588    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.182708    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.182715    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.182734    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.182830    2954 round_trippers.go:463] GET https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0731 09:54:06.182842    2954 round_trippers.go:469] Request Headers:
	I0731 09:54:06.182849    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:54:06.182854    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:54:06.189976    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:54:06.190422    2954 round_trippers.go:463] PUT https://192.169.0.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0731 09:54:06.190430    2954 round_trippers.go:469] Request Headers:
	I0731 09:54:06.190435    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:54:06.190439    2954 round_trippers.go:473]     Content-Type: application/json
	I0731 09:54:06.190441    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:54:06.192143    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:54:06.192277    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.192285    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.192466    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.192478    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.192482    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318368    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.318380    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.318552    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.318557    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318564    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.318573    2954 main.go:141] libmachine: Making call to close driver server
	I0731 09:54:06.318591    2954 main.go:141] libmachine: (ha-393000) Calling .Close
	I0731 09:54:06.318752    2954 main.go:141] libmachine: (ha-393000) DBG | Closing plugin on server side
	I0731 09:54:06.318752    2954 main.go:141] libmachine: Successfully made call to close driver server
	I0731 09:54:06.318769    2954 main.go:141] libmachine: Making call to close connection to plugin binary
	I0731 09:54:06.354999    2954 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
	I0731 09:54:06.412621    2954 addons.go:510] duration metric: took 835.314471ms for enable addons: enabled=[default-storageclass storage-provisioner]
	I0731 09:54:06.412653    2954 start.go:246] waiting for cluster config update ...
	I0731 09:54:06.412665    2954 start.go:255] writing updated cluster config ...
	I0731 09:54:06.449784    2954 out.go:177] 
	I0731 09:54:06.487284    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:06.487391    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:06.509688    2954 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 09:54:06.585678    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:54:06.585712    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:54:06.585911    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:54:06.585931    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:54:06.586023    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:06.586742    2954 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:54:06.586867    2954 start.go:364] duration metric: took 101.68µs to acquireMachinesLock for "ha-393000-m02"
	I0731 09:54:06.586897    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks
:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:06.586986    2954 start.go:125] createHost starting for "m02" (driver="hyperkit")
	I0731 09:54:06.608709    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:54:06.608788    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:06.608805    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:06.617299    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51073
	I0731 09:54:06.617638    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:06.618011    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:06.618029    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:06.618237    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:06.618326    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:06.618405    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:06.618514    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:54:06.618528    2954 client.go:168] LocalClient.Create starting
	I0731 09:54:06.618559    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:54:06.618609    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:54:06.618620    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:54:06.618668    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:54:06.618707    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:54:06.618717    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:54:06.618731    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:54:06.618737    2954 main.go:141] libmachine: (ha-393000-m02) Calling .PreCreateCheck
	I0731 09:54:06.618808    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:06.618841    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:06.646223    2954 main.go:141] libmachine: Creating machine...
	I0731 09:54:06.646236    2954 main.go:141] libmachine: (ha-393000-m02) Calling .Create
	I0731 09:54:06.646361    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:06.646520    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.646351    2979 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:54:06.646597    2954 main.go:141] libmachine: (ha-393000-m02) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:54:06.831715    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.831641    2979 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa...
	I0731 09:54:06.939142    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.939044    2979 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk...
	I0731 09:54:06.939162    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Writing magic tar header
	I0731 09:54:06.939170    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Writing SSH key tar header
	I0731 09:54:06.940042    2954 main.go:141] libmachine: (ha-393000-m02) DBG | I0731 09:54:06.939949    2979 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02 ...
	I0731 09:54:07.311809    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:07.311824    2954 main.go:141] libmachine: (ha-393000-m02) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 09:54:07.311866    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 09:54:07.337818    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 09:54:07.337835    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:54:07.337884    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:54:07.337912    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:54:07.337954    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machine
s/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:54:07.337986    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=t
tyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:54:07.338000    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:54:07.340860    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 DEBUG: hyperkit: Pid is 2980
	I0731 09:54:07.341360    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 09:54:07.341374    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:07.341426    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:07.342343    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:07.342405    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:07.342418    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:07.342433    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:07.342443    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:07.342451    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:07.348297    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:54:07.357913    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:54:07.358688    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:54:07.358712    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:54:07.358723    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:54:07.358740    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:54:07.743017    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:54:07.743035    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:54:07.858034    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:54:07.858062    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:54:07.858072    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:54:07.858084    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:54:07.858884    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:54:07.858896    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:07 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:54:09.343775    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 1
	I0731 09:54:09.343792    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:09.343900    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:09.344720    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:09.344781    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:09.344792    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:09.344804    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:09.344817    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:09.344826    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:11.346829    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 2
	I0731 09:54:11.346846    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:11.346940    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:11.347752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:11.347766    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:11.347784    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:11.347795    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:11.347819    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:11.347832    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:13.348981    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 3
	I0731 09:54:13.349001    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:13.349109    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:13.349907    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:13.349943    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:13.349954    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:13.349965    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:13.349972    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:13.349980    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:13.459282    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 09:54:13.459342    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 09:54:13.459355    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 09:54:13.483197    2954 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 09:54:13 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 09:54:15.351752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 4
	I0731 09:54:15.351769    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:15.351820    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:15.352675    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:15.352721    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 4 entries in /var/db/dhcpd_leases!
	I0731 09:54:15.352735    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:54:15.352744    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:54:15.352752    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:54:15.352760    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:54:17.353423    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 5
	I0731 09:54:17.353439    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:17.353530    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:17.354334    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 09:54:17.354363    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:54:17.354369    2954 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:54:17.354392    2954 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 09:54:17.354398    2954 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 09:54:17.354469    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:17.355226    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:17.355356    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:17.355457    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:54:17.355466    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 09:54:17.355564    2954 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:17.355626    2954 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 2980
	I0731 09:54:17.356407    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:54:17.356415    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:54:17.356426    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:54:17.356432    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:17.356529    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:17.356628    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:17.356727    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:17.356823    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:17.356939    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:17.357111    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:17.357118    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:54:18.376907    2954 main.go:141] libmachine: Error dialing TCP: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
	I0731 09:54:21.440008    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:54:21.440021    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:54:21.440026    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.440157    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.440265    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.440360    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.440445    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.440567    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.440720    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.440728    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:54:21.502840    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:54:21.502894    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:54:21.502900    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:54:21.502905    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.503041    2954 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 09:54:21.503052    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.503150    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.503242    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.503322    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.503392    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.503473    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.503584    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.503728    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.503737    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 09:54:21.579730    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 09:54:21.579745    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.579874    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.579976    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.580070    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.580163    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.580287    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.580427    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.580439    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:54:21.651021    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:54:21.651038    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:54:21.651048    2954 buildroot.go:174] setting up certificates
	I0731 09:54:21.651054    2954 provision.go:84] configureAuth start
	I0731 09:54:21.651061    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 09:54:21.651192    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:21.651290    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.651382    2954 provision.go:143] copyHostCerts
	I0731 09:54:21.651408    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:54:21.651454    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:54:21.651459    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:54:21.651611    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:54:21.651812    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:54:21.651848    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:54:21.651853    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:54:21.651933    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:54:21.652069    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:54:21.652109    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:54:21.652114    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:54:21.652196    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:54:21.652337    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 09:54:21.695144    2954 provision.go:177] copyRemoteCerts
	I0731 09:54:21.695204    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:54:21.695225    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.695363    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.695457    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.695544    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.695616    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:21.734262    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:54:21.734338    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 09:54:21.760893    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:54:21.760979    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 09:54:21.787062    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:54:21.787131    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:54:21.807971    2954 provision.go:87] duration metric: took 156.910143ms to configureAuth
	I0731 09:54:21.807985    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:54:21.808123    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:21.808137    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:21.808270    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.808350    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.808427    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.808504    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.808592    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.808693    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.808822    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.808830    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:54:21.871923    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:54:21.871936    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:54:21.872014    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:54:21.872025    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.872159    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.872242    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.872339    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.872432    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.872558    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.872693    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.872741    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:54:21.947253    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:54:21.947272    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:21.947434    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:21.947533    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.947607    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:21.947689    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:21.947845    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:21.947990    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:21.948005    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:54:23.521299    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 09:54:23.521320    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:54:23.521327    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetURL
	I0731 09:54:23.521467    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:54:23.521475    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:54:23.521480    2954 client.go:171] duration metric: took 16.903099578s to LocalClient.Create
	I0731 09:54:23.521492    2954 start.go:167] duration metric: took 16.903132869s to libmachine.API.Create "ha-393000"
	I0731 09:54:23.521498    2954 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 09:54:23.521504    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:54:23.521519    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.521663    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:54:23.521677    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.521769    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.521859    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.521933    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.522032    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:23.560604    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:54:23.563782    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:54:23.563793    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:54:23.563892    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:54:23.564080    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:54:23.564086    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:54:23.564293    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:54:23.571517    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:54:23.591429    2954 start.go:296] duration metric: took 69.922656ms for postStartSetup
	I0731 09:54:23.591460    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 09:54:23.592068    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:23.592212    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:54:23.592596    2954 start.go:128] duration metric: took 17.005735325s to createHost
	I0731 09:54:23.592609    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.592713    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.592826    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.592928    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.593022    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.593148    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:54:23.593279    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 09:54:23.593287    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:54:23.656618    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444863.810880618
	
	I0731 09:54:23.656630    2954 fix.go:216] guest clock: 1722444863.810880618
	I0731 09:54:23.656635    2954 fix.go:229] Guest: 2024-07-31 09:54:23.810880618 -0700 PDT Remote: 2024-07-31 09:54:23.592602 -0700 PDT m=+67.492982270 (delta=218.278618ms)
	I0731 09:54:23.656654    2954 fix.go:200] guest clock delta is within tolerance: 218.278618ms
	I0731 09:54:23.656663    2954 start.go:83] releasing machines lock for "ha-393000-m02", held for 17.069938552s
	I0731 09:54:23.656681    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.656811    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:23.684522    2954 out.go:177] * Found network options:
	I0731 09:54:23.836571    2954 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 09:54:23.866932    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:54:23.866975    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.867861    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.868089    2954 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 09:54:23.868209    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:54:23.868288    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 09:54:23.868332    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:54:23.868439    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 09:54:23.868462    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 09:54:23.868525    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.868708    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 09:54:23.868756    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.868922    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 09:54:23.868944    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.869058    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 09:54:23.869081    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 09:54:23.869206    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 09:54:23.904135    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:54:23.904205    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:54:23.927324    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:54:23.927338    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:54:23.927400    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:54:23.970222    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:54:23.978777    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:54:23.987481    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:54:23.987533    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:54:23.996430    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:54:24.004692    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:54:24.012968    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:54:24.021204    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:54:24.030482    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:54:24.038802    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:54:24.047006    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:54:24.055781    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:54:24.063050    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:54:24.072089    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:24.169406    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:54:24.189452    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:54:24.189519    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:54:24.202393    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:54:24.214821    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:54:24.229583    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:54:24.240171    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:54:24.250428    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:54:24.302946    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:54:24.313120    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:54:24.327912    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:54:24.331673    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:54:24.338902    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:54:24.352339    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:54:24.449032    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:54:24.557842    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:54:24.557870    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:54:24.571700    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:24.673137    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:54:27.047079    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.373944592s)
	I0731 09:54:27.047137    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:54:27.057410    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:54:27.071816    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:54:27.082278    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:54:27.176448    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:54:27.277016    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:27.384870    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:54:27.398860    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:54:27.409735    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:27.507837    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:54:27.568313    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:54:27.568381    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:54:27.573262    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:54:27.573320    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:54:27.579109    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:54:27.606116    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:54:27.606208    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:54:27.625621    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:54:27.663443    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:54:27.704938    2954 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 09:54:27.726212    2954 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 09:54:27.726560    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:54:27.730336    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:54:27.740553    2954 mustload.go:65] Loading cluster: ha-393000
	I0731 09:54:27.740700    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:54:27.740921    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:27.740943    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:27.749667    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51097
	I0731 09:54:27.750028    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:27.750384    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:27.750401    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:27.750596    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:27.750732    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:54:27.750813    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:54:27.750888    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:54:27.751853    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:27.752094    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:27.752117    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:27.760565    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51099
	I0731 09:54:27.760882    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:27.761210    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:27.761223    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:27.761435    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:27.761551    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:27.761648    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.6
	I0731 09:54:27.761653    2954 certs.go:194] generating shared ca certs ...
	I0731 09:54:27.761672    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.761836    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:54:27.761936    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:54:27.761945    2954 certs.go:256] generating profile certs ...
	I0731 09:54:27.762034    2954 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:54:27.762058    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069
	I0731 09:54:27.762073    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.254]
	I0731 09:54:27.834156    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 ...
	I0731 09:54:27.834169    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069: {Name:mk0062f228b9fa8374eba60d674a49cb0265b988 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.834495    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069 ...
	I0731 09:54:27.834504    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069: {Name:mkd62a5cca652a59908630fd95f20d2e01386237 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:54:27.834713    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.6bbcf069 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:54:27.834929    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.6bbcf069 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:54:27.835197    2954 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:54:27.835206    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:54:27.835229    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:54:27.835247    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:54:27.835267    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:54:27.835284    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:54:27.835302    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:54:27.835321    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:54:27.835338    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:54:27.835425    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:54:27.835473    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:54:27.835481    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:54:27.835511    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:54:27.835539    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:54:27.835575    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:54:27.835647    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:54:27.835682    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:54:27.835703    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:54:27.835723    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:27.835762    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:27.835910    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:27.836005    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:27.836102    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:27.836203    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:27.868754    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0731 09:54:27.872390    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 09:54:27.881305    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0731 09:54:27.884697    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 09:54:27.893772    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 09:54:27.896980    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 09:54:27.905593    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0731 09:54:27.908812    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 09:54:27.916605    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0731 09:54:27.919921    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 09:54:27.927985    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0731 09:54:27.931223    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 09:54:27.940238    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:54:27.960044    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:54:27.980032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:54:27.999204    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:54:28.018549    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0731 09:54:28.037848    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 09:54:28.057376    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:54:28.076776    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:54:28.096215    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:54:28.115885    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:54:28.135490    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:54:28.154907    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 09:54:28.169275    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 09:54:28.183001    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 09:54:28.196610    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 09:54:28.210320    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 09:54:28.223811    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 09:54:28.237999    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 09:54:28.251767    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:54:28.256201    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:54:28.265361    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.268834    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.268882    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:54:28.273194    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:54:28.282819    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:54:28.292122    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.295585    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.295622    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:54:28.299894    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:54:28.308965    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:54:28.318848    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.322347    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.322383    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:54:28.326657    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 09:54:28.335765    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:54:28.338885    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:54:28.338923    2954 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.30.3 docker true true} ...
	I0731 09:54:28.338981    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:54:28.338998    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:54:28.339031    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:54:28.352962    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:54:28.353010    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:54:28.353068    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:54:28.361447    2954 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 09:54:28.361501    2954 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 09:54:28.370031    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet
	I0731 09:54:28.370031    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm
	I0731 09:54:28.370036    2954 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl
	I0731 09:54:31.406224    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:54:31.406308    2954 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:54:31.409804    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 09:54:31.409825    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 09:54:32.215163    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:54:32.215265    2954 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:54:32.218832    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 09:54:32.218858    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 09:54:39.678084    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:54:39.690174    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:54:39.690295    2954 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:54:39.693595    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 09:54:39.693614    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 09:54:39.964594    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 09:54:39.972786    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 09:54:39.986436    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:54:39.999856    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 09:54:40.013590    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:54:40.016608    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:54:40.026617    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:54:40.125738    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:54:40.142197    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:54:40.142482    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:54:40.142512    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:54:40.151352    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51126
	I0731 09:54:40.151710    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:54:40.152074    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:54:40.152091    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:54:40.152318    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:54:40.152428    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:54:40.152528    2954 start.go:317] joinCluster: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:54:40.152603    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0731 09:54:40.152616    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:54:40.152722    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:54:40.152805    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:54:40.152933    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:54:40.153036    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:54:40.232831    2954 start.go:343] trying to join control-plane node "m02" to cluster: &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:54:40.232861    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token heh6bo.7n85cftszx0hevpy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443"
	I0731 09:55:07.963279    2954 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token heh6bo.7n85cftszx0hevpy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m02 --control-plane --apiserver-advertise-address=192.169.0.6 --apiserver-bind-port=8443": (27.730638671s)
	I0731 09:55:07.963316    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0731 09:55:08.368958    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000-m02 minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=false
	I0731 09:55:08.452570    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-393000-m02 node-role.kubernetes.io/control-plane:NoSchedule-
	I0731 09:55:08.540019    2954 start.go:319] duration metric: took 28.387749448s to joinCluster
	I0731 09:55:08.540065    2954 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:08.540296    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:08.563232    2954 out.go:177] * Verifying Kubernetes components...
	I0731 09:55:08.603726    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:08.841318    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:55:08.872308    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:55:08.872512    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 09:55:08.872555    2954 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 09:55:08.872732    2954 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m02" to be "Ready" ...
	I0731 09:55:08.872795    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:08.872800    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:08.872806    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:08.872810    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:08.881842    2954 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0731 09:55:09.372875    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:09.372888    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:09.372894    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:09.372897    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:09.374975    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:09.872917    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:09.872929    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:09.872935    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:09.872939    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:09.875869    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.372943    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:10.372956    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:10.372964    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:10.372967    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:10.375041    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.874945    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:10.875020    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:10.875035    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:10.875043    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:10.877858    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:10.878307    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:11.373440    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:11.373461    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:11.373468    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:11.373472    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:11.376182    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:11.874612    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:11.874624    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:11.874630    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:11.874634    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:11.876432    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:12.374085    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:12.374098    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:12.374104    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:12.374107    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:12.376039    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:12.874234    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:12.874246    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:12.874252    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:12.874255    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:12.876210    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:13.374284    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:13.374372    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:13.374387    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:13.374396    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:13.377959    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:13.378403    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:13.873814    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:13.873839    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:13.873850    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:13.873856    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:13.876640    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:14.373497    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:14.373550    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:14.373561    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:14.373570    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:14.376681    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:14.872976    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:14.873065    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:14.873079    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:14.873087    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:14.875607    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:15.373684    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:15.373702    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:15.373711    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:15.373716    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:15.375839    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:15.873002    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:15.873028    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:15.873040    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:15.873049    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:15.876311    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:15.877408    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:16.373017    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:16.373044    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:16.373110    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:16.373119    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:16.376651    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:16.873932    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:16.873951    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:16.873958    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:16.873961    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:16.875945    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:17.372883    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:17.372963    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:17.372979    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:17.372987    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:17.375706    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:17.874312    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:17.874334    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:17.874343    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:17.874381    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:17.876575    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:18.374077    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:18.374176    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:18.374191    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:18.374197    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:18.377131    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:18.377505    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:18.874567    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:18.874589    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:18.874653    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:18.874658    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:18.877221    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:19.373331    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:19.373347    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:19.373387    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:19.373392    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:19.375412    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:19.873283    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:19.873307    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:19.873320    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:19.873326    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:19.876694    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.373050    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:20.373075    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:20.373086    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:20.373096    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:20.376371    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.874379    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:20.874402    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:20.874414    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:20.874421    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:20.877609    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:20.878167    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:21.373483    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:21.373509    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:21.373520    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:21.373526    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:21.376649    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:21.872794    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:21.872825    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:21.872832    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:21.872837    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:21.874864    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:22.373703    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:22.373721    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:22.373733    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:22.373739    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:22.376275    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:22.872731    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:22.872746    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:22.872752    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:22.872756    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:22.875078    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:23.373989    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:23.374007    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:23.374017    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:23.374021    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:23.376252    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:23.376876    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:23.874071    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:23.874095    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:23.874118    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:23.874128    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:23.877415    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:24.373797    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:24.373828    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:24.373836    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:24.373842    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:24.375723    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:24.873198    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:24.873217    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:24.873239    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:24.873242    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:24.874997    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:25.373864    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:25.373964    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:25.373983    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:25.373993    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:25.376940    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:25.377783    2954 node_ready.go:53] node "ha-393000-m02" has status "Ready":"False"
	I0731 09:55:25.873066    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:25.873140    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:25.873157    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:25.873167    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:25.876035    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:26.373560    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:26.373582    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:26.373594    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:26.373600    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:26.376763    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:26.872802    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:26.872826    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:26.872847    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:26.872855    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:26.875665    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.372793    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.372848    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.372859    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.372865    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.375283    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.872817    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.872887    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.872897    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.872902    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.875143    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:27.875477    2954 node_ready.go:49] node "ha-393000-m02" has status "Ready":"True"
	I0731 09:55:27.875491    2954 node_ready.go:38] duration metric: took 19.002910931s for node "ha-393000-m02" to be "Ready" ...
	I0731 09:55:27.875498    2954 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:55:27.875539    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:27.875545    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.875550    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.875554    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.884028    2954 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0731 09:55:27.888275    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.888338    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 09:55:27.888344    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.888351    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.888354    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.895154    2954 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 09:55:27.895668    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.895676    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.895682    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.895685    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.903221    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:55:27.903585    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.903594    2954 pod_ready.go:81] duration metric: took 15.30431ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.903601    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.903644    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 09:55:27.903649    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.903655    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.903659    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.910903    2954 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0731 09:55:27.911272    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.911279    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.911284    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.911287    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.912846    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.913176    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.913184    2954 pod_ready.go:81] duration metric: took 9.57768ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.913191    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.913223    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 09:55:27.913228    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.913233    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.913237    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.914947    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.915374    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:27.915380    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.915386    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.915390    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.916800    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.917134    2954 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.917142    2954 pod_ready.go:81] duration metric: took 3.945951ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.917148    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.917182    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 09:55:27.917186    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.917192    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.917199    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.919108    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.919519    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:27.919526    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:27.919532    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:27.919538    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:27.920909    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:27.921212    2954 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:27.921221    2954 pod_ready.go:81] duration metric: took 4.068426ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:27.921231    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.073440    2954 request.go:629] Waited for 152.136555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:55:28.073539    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:55:28.073547    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.073555    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.073561    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.075944    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:28.272878    2954 request.go:629] Waited for 196.473522ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:28.272966    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:28.272972    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.272978    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.272981    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.274914    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:28.275308    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:28.275318    2954 pod_ready.go:81] duration metric: took 354.084518ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.275325    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.473409    2954 request.go:629] Waited for 198.051207ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:55:28.473441    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:55:28.473447    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.473463    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.473467    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.475323    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:28.673703    2954 request.go:629] Waited for 197.835098ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:28.673754    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:28.673765    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.673772    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.673777    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.676049    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:28.676485    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:28.676497    2954 pod_ready.go:81] duration metric: took 401.169334ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.676504    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:28.874899    2954 request.go:629] Waited for 198.343236ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:55:28.875005    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:55:28.875014    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:28.875025    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:28.875031    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:28.878371    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:29.072894    2954 request.go:629] Waited for 193.894527ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:29.072997    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:29.073009    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.073020    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.073029    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.075911    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.076354    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.076367    2954 pod_ready.go:81] duration metric: took 399.859987ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.076376    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.273708    2954 request.go:629] Waited for 197.294345ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:55:29.273758    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:55:29.273806    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.273815    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.273819    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.276500    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.473244    2954 request.go:629] Waited for 196.211404ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.473347    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.473355    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.473363    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.473367    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.475855    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:29.476256    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.476266    2954 pod_ready.go:81] duration metric: took 399.888458ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.476273    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.672987    2954 request.go:629] Waited for 196.670765ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:55:29.673094    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:55:29.673114    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.673128    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.673135    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.676240    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:29.874264    2954 request.go:629] Waited for 197.423472ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.874348    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:29.874352    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:29.874365    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:29.874369    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:29.876229    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:29.876542    2954 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:29.876551    2954 pod_ready.go:81] duration metric: took 400.273525ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:29.876557    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.073985    2954 request.go:629] Waited for 197.386483ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:55:30.074064    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:55:30.074071    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.074076    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.074080    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.075934    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:30.274353    2954 request.go:629] Waited for 197.921759ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.274399    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.274408    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.274421    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.274429    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.276767    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:30.277075    2954 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:30.277085    2954 pod_ready.go:81] duration metric: took 400.525562ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.277092    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.474867    2954 request.go:629] Waited for 197.733458ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:55:30.474919    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:55:30.474936    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.474949    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.474958    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.478180    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:30.673620    2954 request.go:629] Waited for 194.924994ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.673658    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:55:30.673662    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.673668    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.673674    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.675356    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:30.675625    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:30.675634    2954 pod_ready.go:81] duration metric: took 398.539654ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.675640    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:30.873712    2954 request.go:629] Waited for 198.03899ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:55:30.873795    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:55:30.873801    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:30.873807    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:30.873811    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:30.875750    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:31.074152    2954 request.go:629] Waited for 197.932145ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:31.074207    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:55:31.074215    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.074227    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.074234    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.077132    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:31.077723    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:55:31.077735    2954 pod_ready.go:81] duration metric: took 402.091925ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:55:31.077744    2954 pod_ready.go:38] duration metric: took 3.202266702s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:55:31.077770    2954 api_server.go:52] waiting for apiserver process to appear ...
	I0731 09:55:31.077872    2954 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:55:31.089706    2954 api_server.go:72] duration metric: took 22.549827849s to wait for apiserver process to appear ...
	I0731 09:55:31.089719    2954 api_server.go:88] waiting for apiserver healthz status ...
	I0731 09:55:31.089735    2954 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 09:55:31.093731    2954 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 09:55:31.093774    2954 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 09:55:31.093779    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.093785    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.093789    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.094287    2954 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 09:55:31.094337    2954 api_server.go:141] control plane version: v1.30.3
	I0731 09:55:31.094346    2954 api_server.go:131] duration metric: took 4.622445ms to wait for apiserver health ...
	I0731 09:55:31.094351    2954 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 09:55:31.272834    2954 request.go:629] Waited for 178.447514ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.272864    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.272868    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.272874    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.272879    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.275929    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:31.278922    2954 system_pods.go:59] 17 kube-system pods found
	I0731 09:55:31.278939    2954 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:55:31.278943    2954 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:55:31.278948    2954 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:55:31.278951    2954 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:55:31.278954    2954 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:55:31.278957    2954 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:55:31.278960    2954 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:55:31.278963    2954 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:55:31.278966    2954 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:55:31.278968    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:55:31.278971    2954 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:55:31.278973    2954 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:55:31.278976    2954 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:55:31.278982    2954 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:55:31.278986    2954 system_pods.go:61] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:55:31.278988    2954 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:55:31.278991    2954 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:55:31.278996    2954 system_pods.go:74] duration metric: took 184.642078ms to wait for pod list to return data ...
	I0731 09:55:31.279002    2954 default_sa.go:34] waiting for default service account to be created ...
	I0731 09:55:31.473455    2954 request.go:629] Waited for 194.413647ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:55:31.473487    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:55:31.473492    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.473498    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.473502    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.475460    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:55:31.475608    2954 default_sa.go:45] found service account: "default"
	I0731 09:55:31.475618    2954 default_sa.go:55] duration metric: took 196.612794ms for default service account to be created ...
	I0731 09:55:31.475624    2954 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 09:55:31.673326    2954 request.go:629] Waited for 197.663631ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.673362    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:55:31.673369    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.673377    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.673384    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.676582    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:55:31.680023    2954 system_pods.go:86] 17 kube-system pods found
	I0731 09:55:31.680035    2954 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:55:31.680039    2954 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:55:31.680042    2954 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:55:31.680045    2954 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:55:31.680048    2954 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:55:31.680051    2954 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:55:31.680054    2954 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:55:31.680057    2954 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:55:31.680060    2954 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:55:31.680063    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:55:31.680067    2954 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:55:31.680070    2954 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:55:31.680073    2954 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:55:31.680076    2954 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:55:31.680079    2954 system_pods.go:89] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:55:31.680082    2954 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:55:31.680085    2954 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:55:31.680089    2954 system_pods.go:126] duration metric: took 204.462284ms to wait for k8s-apps to be running ...
	I0731 09:55:31.680093    2954 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 09:55:31.680137    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:55:31.691384    2954 system_svc.go:56] duration metric: took 11.279108ms WaitForService to wait for kubelet
	I0731 09:55:31.691399    2954 kubeadm.go:582] duration metric: took 23.151526974s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:55:31.691411    2954 node_conditions.go:102] verifying NodePressure condition ...
	I0731 09:55:31.872842    2954 request.go:629] Waited for 181.393446ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 09:55:31.872873    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 09:55:31.872877    2954 round_trippers.go:469] Request Headers:
	I0731 09:55:31.872884    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:55:31.872887    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:55:31.875560    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:55:31.876076    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:55:31.876090    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:55:31.876101    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:55:31.876111    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:55:31.876115    2954 node_conditions.go:105] duration metric: took 184.70211ms to run NodePressure ...
	I0731 09:55:31.876123    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:55:31.876138    2954 start.go:255] writing updated cluster config ...
	I0731 09:55:31.896708    2954 out.go:177] 
	I0731 09:55:31.917824    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:31.917916    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:31.939502    2954 out.go:177] * Starting "ha-393000-m03" control-plane node in "ha-393000" cluster
	I0731 09:55:31.981501    2954 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:55:31.981523    2954 cache.go:56] Caching tarball of preloaded images
	I0731 09:55:31.981705    2954 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 09:55:31.981717    2954 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 09:55:31.981841    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:31.982574    2954 start.go:360] acquireMachinesLock for ha-393000-m03: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 09:55:31.982642    2954 start.go:364] duration metric: took 52.194µs to acquireMachinesLock for "ha-393000-m03"
	I0731 09:55:31.982663    2954 start.go:93] Provisioning new machine with config: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m03 IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:31.982776    2954 start.go:125] createHost starting for "m03" (driver="hyperkit")
	I0731 09:55:32.003523    2954 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0731 09:55:32.003599    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:32.003626    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:32.012279    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51131
	I0731 09:55:32.012622    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:32.012991    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:32.013008    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:32.013225    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:32.013332    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:32.013417    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:32.013511    2954 start.go:159] libmachine.API.Create for "ha-393000" (driver="hyperkit")
	I0731 09:55:32.013531    2954 client.go:168] LocalClient.Create starting
	I0731 09:55:32.013562    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem
	I0731 09:55:32.013605    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:55:32.013616    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:55:32.013658    2954 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem
	I0731 09:55:32.013685    2954 main.go:141] libmachine: Decoding PEM data...
	I0731 09:55:32.013695    2954 main.go:141] libmachine: Parsing certificate...
	I0731 09:55:32.013708    2954 main.go:141] libmachine: Running pre-create checks...
	I0731 09:55:32.013722    2954 main.go:141] libmachine: (ha-393000-m03) Calling .PreCreateCheck
	I0731 09:55:32.013796    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:32.013821    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:32.024803    2954 main.go:141] libmachine: Creating machine...
	I0731 09:55:32.024819    2954 main.go:141] libmachine: (ha-393000-m03) Calling .Create
	I0731 09:55:32.024954    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:32.025189    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.024948    2993 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:55:32.025311    2954 main.go:141] libmachine: (ha-393000-m03) Downloading /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso...
	I0731 09:55:32.387382    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.387300    2993 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa...
	I0731 09:55:32.468181    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.468125    2993 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk...
	I0731 09:55:32.468207    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Writing magic tar header
	I0731 09:55:32.468229    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Writing SSH key tar header
	I0731 09:55:32.468792    2954 main.go:141] libmachine: (ha-393000-m03) DBG | I0731 09:55:32.468762    2993 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03 ...
	I0731 09:55:33.078663    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:33.078680    2954 main.go:141] libmachine: (ha-393000-m03) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid
	I0731 09:55:33.078716    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Using UUID 451d6bef-97a7-42a6-8ccb-b8851dda0594
	I0731 09:55:33.103258    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Generated MAC 3e:56:a2:18:e2:4c
	I0731 09:55:33.103280    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 09:55:33.103347    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:55:33.103394    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001d2240)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 09:55:33.103443    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "451d6bef-97a7-42a6-8ccb-b8851dda0594", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 09:55:33.103490    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 451d6bef-97a7-42a6-8ccb-b8851dda0594 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 09:55:33.103507    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 09:55:33.106351    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 DEBUG: hyperkit: Pid is 2994
	I0731 09:55:33.106790    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 0
	I0731 09:55:33.106810    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:33.106894    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:33.107878    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:33.107923    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:33.107940    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:33.107959    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:33.107977    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:33.107995    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:33.108059    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:33.114040    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 09:55:33.122160    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 09:55:33.123003    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:55:33.123036    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:55:33.123053    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:55:33.123062    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:55:33.505461    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 09:55:33.505481    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 09:55:33.620173    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 09:55:33.620193    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 09:55:33.620213    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 09:55:33.620225    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 09:55:33.621055    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 09:55:33.621064    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:33 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 09:55:35.108561    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 1
	I0731 09:55:35.108578    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:35.108664    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:35.109476    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:35.109527    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:35.109535    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:35.109543    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:35.109553    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:35.109564    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:35.109588    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:37.111452    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 2
	I0731 09:55:37.111469    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:37.111534    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:37.112347    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:37.112387    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:37.112400    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:37.112409    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:37.112418    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:37.112431    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:37.112438    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:39.113861    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 3
	I0731 09:55:39.113876    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:39.113989    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:39.114793    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:39.114841    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:39.114854    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:39.114871    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:39.114881    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:39.114894    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:39.114910    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:39.197635    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0731 09:55:39.197744    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0731 09:55:39.197756    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0731 09:55:39.222062    2954 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 09:55:39 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 0
	I0731 09:55:41.116408    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 4
	I0731 09:55:41.116425    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:41.116529    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:41.117328    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:41.117368    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 5 entries in /var/db/dhcpd_leases!
	I0731 09:55:41.117376    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbdb8}
	I0731 09:55:41.117399    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 09:55:41.117416    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.4 HWAddress:9e:4a:75:aa:a4:d ID:1,9e:4a:75:aa:a4:d Lease:0x66abbca2}
	I0731 09:55:41.117425    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.3 HWAddress:ea:f4:8d:9f:5f:6f ID:1,ea:f4:8d:9f:5f:6f Lease:0x66aa6a7f}
	I0731 09:55:41.117441    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.2 HWAddress:3e:e7:9b:4b:ef:36 ID:1,3e:e7:9b:4b:ef:36 Lease:0x66abba63}
	I0731 09:55:43.117722    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 5
	I0731 09:55:43.117737    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:43.117828    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:43.118651    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 09:55:43.118699    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found 6 entries in /var/db/dhcpd_leases!
	I0731 09:55:43.118714    2954 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 09:55:43.118721    2954 main.go:141] libmachine: (ha-393000-m03) DBG | Found match: 3e:56:a2:18:e2:4c
	I0731 09:55:43.118726    2954 main.go:141] libmachine: (ha-393000-m03) DBG | IP: 192.169.0.7
	I0731 09:55:43.118795    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:43.119393    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:43.119491    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:43.119572    2954 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0731 09:55:43.119580    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 09:55:43.119659    2954 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:43.119724    2954 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 09:55:43.120517    2954 main.go:141] libmachine: Detecting operating system of created instance...
	I0731 09:55:43.120525    2954 main.go:141] libmachine: Waiting for SSH to be available...
	I0731 09:55:43.120529    2954 main.go:141] libmachine: Getting to WaitForSSH function...
	I0731 09:55:43.120540    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:43.120627    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:43.120733    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:43.120830    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:43.120937    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:43.121066    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:43.121248    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:43.121256    2954 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0731 09:55:44.180872    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:55:44.180885    2954 main.go:141] libmachine: Detecting the provisioner...
	I0731 09:55:44.180891    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.181020    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.181119    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.181200    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.181293    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.181426    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.181579    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.181587    2954 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0731 09:55:44.244214    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0731 09:55:44.244264    2954 main.go:141] libmachine: found compatible host: buildroot
	I0731 09:55:44.244271    2954 main.go:141] libmachine: Provisioning with buildroot...
	I0731 09:55:44.244277    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.244401    2954 buildroot.go:166] provisioning hostname "ha-393000-m03"
	I0731 09:55:44.244413    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.244502    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.244591    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.244669    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.244754    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.244838    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.244957    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.245103    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.245112    2954 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m03 && echo "ha-393000-m03" | sudo tee /etc/hostname
	I0731 09:55:44.315698    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m03
	
	I0731 09:55:44.315714    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.315853    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.315950    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.316034    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.316117    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.316237    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.316383    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.316394    2954 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 09:55:44.383039    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 09:55:44.383055    2954 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 09:55:44.383064    2954 buildroot.go:174] setting up certificates
	I0731 09:55:44.383071    2954 provision.go:84] configureAuth start
	I0731 09:55:44.383077    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 09:55:44.383215    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:44.383314    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.383408    2954 provision.go:143] copyHostCerts
	I0731 09:55:44.383435    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:55:44.383482    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 09:55:44.383490    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 09:55:44.383608    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 09:55:44.383821    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:55:44.383853    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 09:55:44.383859    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 09:55:44.383930    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 09:55:44.384107    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:55:44.384137    2954 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 09:55:44.384146    2954 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 09:55:44.384214    2954 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 09:55:44.384364    2954 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m03 san=[127.0.0.1 192.169.0.7 ha-393000-m03 localhost minikube]
	I0731 09:55:44.436199    2954 provision.go:177] copyRemoteCerts
	I0731 09:55:44.436250    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 09:55:44.436265    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.436405    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.436484    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.436578    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.436651    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:44.474166    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 09:55:44.474251    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 09:55:44.495026    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 09:55:44.495089    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 09:55:44.514528    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 09:55:44.514597    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 09:55:44.534382    2954 provision.go:87] duration metric: took 151.304295ms to configureAuth
	I0731 09:55:44.534397    2954 buildroot.go:189] setting minikube options for container-runtime
	I0731 09:55:44.534572    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:44.534587    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:44.534721    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.534815    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.534895    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.534982    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.535063    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.535176    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.535303    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.535311    2954 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 09:55:44.595832    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 09:55:44.595845    2954 buildroot.go:70] root file system type: tmpfs
	I0731 09:55:44.595915    2954 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 09:55:44.595926    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.596055    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.596141    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.596224    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.596312    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.596436    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.596585    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.596629    2954 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 09:55:44.668428    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 09:55:44.668446    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:44.668587    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:44.668687    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.668775    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:44.668883    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:44.669009    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:44.669153    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:44.669165    2954 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 09:55:46.245712    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 09:55:46.245728    2954 main.go:141] libmachine: Checking connection to Docker...
	I0731 09:55:46.245733    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetURL
	I0731 09:55:46.245877    2954 main.go:141] libmachine: Docker is up and running!
	I0731 09:55:46.245886    2954 main.go:141] libmachine: Reticulating splines...
	I0731 09:55:46.245891    2954 client.go:171] duration metric: took 14.176451747s to LocalClient.Create
	I0731 09:55:46.245904    2954 start.go:167] duration metric: took 14.176491485s to libmachine.API.Create "ha-393000"
	I0731 09:55:46.245910    2954 start.go:293] postStartSetup for "ha-393000-m03" (driver="hyperkit")
	I0731 09:55:46.245917    2954 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 09:55:46.245936    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.246092    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 09:55:46.246107    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.246216    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.246326    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.246431    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.246511    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:46.290725    2954 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 09:55:46.294553    2954 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 09:55:46.294567    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 09:55:46.294659    2954 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 09:55:46.294805    2954 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 09:55:46.294812    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 09:55:46.294995    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 09:55:46.303032    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:55:46.335630    2954 start.go:296] duration metric: took 89.711926ms for postStartSetup
	I0731 09:55:46.335676    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 09:55:46.336339    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:46.336499    2954 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 09:55:46.336864    2954 start.go:128] duration metric: took 14.298177246s to createHost
	I0731 09:55:46.336879    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.336971    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.337062    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.337141    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.337213    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.337332    2954 main.go:141] libmachine: Using SSH client type: native
	I0731 09:55:46.337451    2954 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xfae40c0] 0xfae6e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 09:55:46.337458    2954 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 09:55:46.398217    2954 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722444946.512017695
	
	I0731 09:55:46.398229    2954 fix.go:216] guest clock: 1722444946.512017695
	I0731 09:55:46.398235    2954 fix.go:229] Guest: 2024-07-31 09:55:46.512017695 -0700 PDT Remote: 2024-07-31 09:55:46.336873 -0700 PDT m=+150.181968458 (delta=175.144695ms)
	I0731 09:55:46.398245    2954 fix.go:200] guest clock delta is within tolerance: 175.144695ms
	I0731 09:55:46.398250    2954 start.go:83] releasing machines lock for "ha-393000-m03", held for 14.359697621s
	I0731 09:55:46.398269    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.398407    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:46.418329    2954 out.go:177] * Found network options:
	I0731 09:55:46.439149    2954 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0731 09:55:46.477220    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 09:55:46.477241    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:55:46.477255    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.477897    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.478058    2954 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 09:55:46.478150    2954 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 09:55:46.478196    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	W0731 09:55:46.478232    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 09:55:46.478262    2954 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 09:55:46.478353    2954 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 09:55:46.478353    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.478369    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 09:55:46.478511    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.478558    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 09:55:46.478670    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.478731    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 09:55:46.478785    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 09:55:46.478828    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 09:55:46.478931    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	W0731 09:55:46.512520    2954 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 09:55:46.512591    2954 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 09:55:46.558288    2954 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 09:55:46.558305    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:55:46.558391    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:55:46.574105    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 09:55:46.582997    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 09:55:46.591920    2954 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 09:55:46.591969    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 09:55:46.600962    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:55:46.610057    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 09:55:46.619019    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 09:55:46.627876    2954 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 09:55:46.637129    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 09:55:46.646079    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 09:55:46.655162    2954 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 09:55:46.664198    2954 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 09:55:46.672256    2954 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 09:55:46.680371    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:46.778919    2954 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 09:55:46.798064    2954 start.go:495] detecting cgroup driver to use...
	I0731 09:55:46.798132    2954 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 09:55:46.815390    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:55:46.827644    2954 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 09:55:46.842559    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 09:55:46.853790    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:55:46.864444    2954 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 09:55:46.887653    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 09:55:46.898070    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 09:55:46.913256    2954 ssh_runner.go:195] Run: which cri-dockerd
	I0731 09:55:46.916263    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 09:55:46.923424    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 09:55:46.937344    2954 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 09:55:47.035092    2954 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 09:55:47.134788    2954 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 09:55:47.134810    2954 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 09:55:47.149022    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:47.247660    2954 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 09:55:49.540717    2954 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.293040269s)
	I0731 09:55:49.540778    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 09:55:49.551148    2954 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 09:55:49.563946    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:55:49.574438    2954 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 09:55:49.675905    2954 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 09:55:49.777958    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:49.889335    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 09:55:49.903338    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 09:55:49.914450    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:50.020127    2954 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 09:55:50.079269    2954 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 09:55:50.079351    2954 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 09:55:50.085411    2954 start.go:563] Will wait 60s for crictl version
	I0731 09:55:50.085468    2954 ssh_runner.go:195] Run: which crictl
	I0731 09:55:50.088527    2954 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 09:55:50.115874    2954 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 09:55:50.115947    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:55:50.133371    2954 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 09:55:50.177817    2954 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 09:55:50.199409    2954 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 09:55:50.242341    2954 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 09:55:50.263457    2954 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 09:55:50.263780    2954 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 09:55:50.267924    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:55:50.277257    2954 mustload.go:65] Loading cluster: ha-393000
	I0731 09:55:50.277434    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:55:50.277675    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:50.277699    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:50.286469    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51154
	I0731 09:55:50.286803    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:50.287152    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:50.287174    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:50.287405    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:50.287529    2954 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 09:55:50.287619    2954 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 09:55:50.287687    2954 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 09:55:50.288682    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:55:50.288947    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:50.288976    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:50.297641    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51156
	I0731 09:55:50.297976    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:50.298336    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:50.298356    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:50.298557    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:50.298695    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:55:50.298796    2954 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.7
	I0731 09:55:50.298803    2954 certs.go:194] generating shared ca certs ...
	I0731 09:55:50.298815    2954 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.298953    2954 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 09:55:50.299004    2954 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 09:55:50.299013    2954 certs.go:256] generating profile certs ...
	I0731 09:55:50.299104    2954 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 09:55:50.299126    2954 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb
	I0731 09:55:50.299146    2954 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0731 09:55:50.438174    2954 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb ...
	I0731 09:55:50.438189    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb: {Name:mk221449ac60933abd0b425ad947a6ab1580c0ce Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.438543    2954 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb ...
	I0731 09:55:50.438553    2954 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb: {Name:mk1cb7896668e4a7a9edaf8893989143a67a7948 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 09:55:50.438773    2954 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.01a63cdb -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 09:55:50.438957    2954 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 09:55:50.439187    2954 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 09:55:50.439201    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 09:55:50.439224    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 09:55:50.439243    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 09:55:50.439262    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 09:55:50.439280    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 09:55:50.439299    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 09:55:50.439317    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 09:55:50.439334    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 09:55:50.439423    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 09:55:50.439459    2954 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 09:55:50.439466    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 09:55:50.439503    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 09:55:50.439532    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 09:55:50.439561    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 09:55:50.439623    2954 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 09:55:50.439662    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.439683    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.439702    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.439730    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:55:50.439869    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:55:50.439971    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:55:50.440060    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:55:50.440149    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:55:50.470145    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0731 09:55:50.473304    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 09:55:50.482843    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0731 09:55:50.486120    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 09:55:50.495117    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 09:55:50.498266    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 09:55:50.507788    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0731 09:55:50.510913    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 09:55:50.519933    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0731 09:55:50.523042    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 09:55:50.531891    2954 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0731 09:55:50.535096    2954 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 09:55:50.544058    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 09:55:50.564330    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 09:55:50.585250    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 09:55:50.605412    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 09:55:50.625492    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1444 bytes)
	I0731 09:55:50.645935    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 09:55:50.666578    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 09:55:50.686734    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 09:55:50.707428    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 09:55:50.728977    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 09:55:50.749365    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 09:55:50.769217    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 09:55:50.782635    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 09:55:50.796452    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 09:55:50.810265    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 09:55:50.823856    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 09:55:50.837713    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 09:55:50.851806    2954 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 09:55:50.865643    2954 ssh_runner.go:195] Run: openssl version
	I0731 09:55:50.869985    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 09:55:50.878755    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.882092    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.882127    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 09:55:50.886361    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 09:55:50.894800    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 09:55:50.903511    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.906902    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.906941    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 09:55:50.911184    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 09:55:50.919457    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 09:55:50.927999    2954 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.931344    2954 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.931398    2954 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 09:55:50.935641    2954 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 09:55:50.944150    2954 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 09:55:50.947330    2954 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 09:55:50.947373    2954 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.30.3 docker true true} ...
	I0731 09:55:50.947432    2954 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 09:55:50.947450    2954 kube-vip.go:115] generating kube-vip config ...
	I0731 09:55:50.947488    2954 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 09:55:50.960195    2954 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 09:55:50.960253    2954 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 09:55:50.960307    2954 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 09:55:50.968017    2954 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 09:55:50.968069    2954 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256
	I0731 09:55:50.975489    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:55:50.975475    2954 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256
	I0731 09:55:50.975509    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:55:50.975519    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:55:50.975557    2954 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 09:55:50.976020    2954 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 09:55:50.987294    2954 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:55:50.987330    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 09:55:50.987350    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 09:55:50.987377    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 09:55:50.987399    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 09:55:50.987416    2954 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 09:55:51.010057    2954 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 09:55:51.010100    2954 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 09:55:51.683575    2954 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 09:55:51.690828    2954 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 09:55:51.704403    2954 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 09:55:51.718184    2954 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 09:55:51.732058    2954 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 09:55:51.735039    2954 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 09:55:51.744606    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:55:51.842284    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:55:51.858313    2954 host.go:66] Checking if "ha-393000" exists ...
	I0731 09:55:51.858589    2954 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:55:51.858612    2954 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:55:51.867825    2954 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51159
	I0731 09:55:51.868326    2954 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:55:51.868657    2954 main.go:141] libmachine: Using API Version  1
	I0731 09:55:51.868668    2954 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:55:51.868882    2954 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:55:51.868991    2954 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 09:55:51.869077    2954 start.go:317] joinCluster: &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 Clu
sterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:fals
e inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimi
zations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:55:51.869219    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0731 09:55:51.869241    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 09:55:51.869330    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 09:55:51.869408    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 09:55:51.869497    2954 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 09:55:51.869579    2954 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 09:55:51.957634    2954 start.go:343] trying to join control-plane node "m03" to cluster: &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:55:51.957691    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 3o7i0i.qey1hcj8w6i3nuyy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443"
	I0731 09:56:20.527748    2954 ssh_runner.go:235] Completed: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.3:$PATH" kubeadm join control-plane.minikube.internal:8443 --token 3o7i0i.qey1hcj8w6i3nuyy --discovery-token-ca-cert-hash sha256:5506a5f6d198c42c37994348fba448e494bf859904f9c608af57ab92ae4e8d24 --ignore-preflight-errors=all --cri-socket unix:///var/run/cri-dockerd.sock --node-name=ha-393000-m03 --control-plane --apiserver-advertise-address=192.169.0.7 --apiserver-bind-port=8443": (28.570050327s)
	I0731 09:56:20.527779    2954 ssh_runner.go:195] Run: /bin/bash -c "sudo systemctl daemon-reload && sudo systemctl enable kubelet && sudo systemctl start kubelet"
	I0731 09:56:20.987700    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-393000-m03 minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9 minikube.k8s.io/name=ha-393000 minikube.k8s.io/primary=false
	I0731 09:56:21.064233    2954 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.3/kubectl --kubeconfig=/var/lib/minikube/kubeconfig taint nodes ha-393000-m03 node-role.kubernetes.io/control-plane:NoSchedule-
	I0731 09:56:21.148165    2954 start.go:319] duration metric: took 29.279096383s to joinCluster
	I0731 09:56:21.148219    2954 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 09:56:21.148483    2954 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:56:21.189791    2954 out.go:177] * Verifying Kubernetes components...
	I0731 09:56:21.248129    2954 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 09:56:21.485219    2954 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 09:56:21.507788    2954 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:56:21.508040    2954 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x10f89660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 09:56:21.508088    2954 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 09:56:21.508300    2954 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m03" to be "Ready" ...
	I0731 09:56:21.508342    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:21.508347    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:21.508353    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:21.508357    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:21.510586    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:22.008706    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:22.008723    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:22.008734    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:22.008738    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:22.010978    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:22.509350    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:22.509366    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:22.509372    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:22.509375    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:22.511656    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:23.009510    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:23.009526    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:23.009532    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:23.009535    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:23.011420    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:23.508500    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:23.508516    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:23.508523    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:23.508526    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:23.510720    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:23.511145    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:24.009377    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:24.009394    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:24.009439    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:24.009443    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:24.011828    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:24.509345    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:24.509361    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:24.509368    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:24.509372    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:24.511614    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:25.009402    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:25.009418    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:25.009424    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:25.009428    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:25.011344    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:25.508774    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:25.508790    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:25.508797    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:25.508800    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:25.510932    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:25.511292    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:26.008449    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:26.008465    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:26.008471    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:26.008474    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:26.010614    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:26.509754    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:26.509786    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:26.509799    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:26.509805    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:26.512347    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:27.008498    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:27.008592    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:27.008608    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:27.008615    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:27.011956    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:27.509028    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:27.509110    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:27.509125    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:27.509132    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:27.512133    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:27.512700    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:28.008990    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:28.009083    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:28.009097    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:28.009103    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:28.012126    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:28.509594    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:28.509612    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:28.509621    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:28.509625    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:28.512206    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:29.009613    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:29.009628    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:29.009634    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:29.009637    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:29.011661    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:29.509044    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:29.509059    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:29.509065    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:29.509068    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:29.511159    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:30.008831    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:30.008905    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:30.008916    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:30.008922    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:30.011246    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:30.011529    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:30.509817    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:30.509832    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:30.509838    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:30.509846    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:30.511920    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:31.008461    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:31.008483    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:31.008493    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:31.008499    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:31.011053    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:31.509184    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:31.509236    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:31.509247    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:31.509252    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:31.511776    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:32.008486    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:32.008510    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:32.008522    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:32.008531    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:32.011649    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:32.012066    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:32.510023    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:32.510037    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:32.510044    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:32.510048    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:32.512097    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:33.010283    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:33.010301    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:33.010310    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:33.010314    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:33.012927    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:33.509693    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:33.509712    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:33.509722    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:33.509726    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:33.512086    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.008568    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:34.008586    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:34.008594    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:34.008599    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:34.010823    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.509266    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:34.509365    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:34.509380    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:34.509386    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:34.512417    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:34.512850    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:35.009777    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:35.009792    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:35.009799    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:35.009802    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:35.011859    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:35.508525    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:35.508582    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:35.508590    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:35.508596    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:35.510810    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:36.009838    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:36.009864    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:36.009876    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:36.009881    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:36.012816    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:36.509201    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:36.509215    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:36.509265    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:36.509269    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:36.511244    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:37.010038    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:37.010064    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:37.010077    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:37.010083    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:37.013339    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:37.013728    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:37.509315    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:37.509330    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:37.509336    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:37.509339    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:37.511753    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:38.009336    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:38.009405    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:38.009415    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:38.009428    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:38.011725    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:38.508458    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:38.508483    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:38.508493    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:38.508500    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:38.511720    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:39.008429    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:39.008452    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:39.008459    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:39.008463    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:39.010408    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:39.508530    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:39.508555    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:39.508569    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:39.508577    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:39.511916    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:39.512435    2954 node_ready.go:53] node "ha-393000-m03" has status "Ready":"False"
	I0731 09:56:40.009629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.009648    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.009663    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.009668    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.011742    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.509939    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.509963    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.509976    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.509982    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.512891    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.513173    2954 node_ready.go:49] node "ha-393000-m03" has status "Ready":"True"
	I0731 09:56:40.513182    2954 node_ready.go:38] duration metric: took 19.004877925s for node "ha-393000-m03" to be "Ready" ...
	I0731 09:56:40.513193    2954 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:56:40.513230    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:40.513235    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.513241    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.513244    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.517063    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:40.521698    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.521758    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 09:56:40.521763    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.521769    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.521773    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.524012    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.524507    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.524515    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.524521    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.524525    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.526095    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.526522    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.526532    2954 pod_ready.go:81] duration metric: took 4.820449ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.526539    2954 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.526579    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 09:56:40.526584    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.526589    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.526597    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.528189    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.528737    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.528744    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.528750    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.528754    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.530442    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.530775    2954 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.530784    2954 pod_ready.go:81] duration metric: took 4.239462ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.530790    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.530822    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 09:56:40.530827    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.530833    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.530840    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.532590    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.533050    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:40.533057    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.533062    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.533066    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.534760    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.535110    2954 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.535119    2954 pod_ready.go:81] duration metric: took 4.323936ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.535125    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.535164    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 09:56:40.535170    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.535175    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.535178    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.536947    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.537444    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:40.537451    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.537456    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.537460    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.539136    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:40.539571    2954 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.539580    2954 pod_ready.go:81] duration metric: took 4.45006ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.539587    2954 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.710116    2954 request.go:629] Waited for 170.494917ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 09:56:40.710174    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 09:56:40.710180    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.710187    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.710190    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.712323    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:40.910582    2954 request.go:629] Waited for 197.870555ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.910719    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:40.910732    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:40.910743    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:40.910750    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:40.913867    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:40.914265    2954 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:40.914278    2954 pod_ready.go:81] duration metric: took 374.68494ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:40.914293    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.110758    2954 request.go:629] Waited for 196.414025ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:56:41.110829    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 09:56:41.110835    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.110841    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.110844    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.112890    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:41.311962    2954 request.go:629] Waited for 198.609388ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:41.311995    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:41.312000    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.312006    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.312010    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.314041    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:41.314399    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:41.314410    2954 pod_ready.go:81] duration metric: took 400.109149ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.314418    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.511371    2954 request.go:629] Waited for 196.905615ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:56:41.511497    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 09:56:41.511508    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.511519    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.511526    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.514702    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:41.710099    2954 request.go:629] Waited for 194.801702ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:41.710131    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:41.710137    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.710143    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.710148    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.711902    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:41.712201    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:41.712211    2954 pod_ready.go:81] duration metric: took 397.788368ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.712225    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:41.910472    2954 request.go:629] Waited for 198.191914ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 09:56:41.910629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 09:56:41.910640    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:41.910651    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:41.910657    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:41.913895    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:42.111114    2954 request.go:629] Waited for 196.678487ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:42.111206    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:42.111214    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.111222    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.111228    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.113500    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.113867    2954 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.113876    2954 pod_ready.go:81] duration metric: took 401.646528ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.113883    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.310054    2954 request.go:629] Waited for 196.129077ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:56:42.310144    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 09:56:42.310151    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.310157    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.310161    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.312081    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:42.510104    2954 request.go:629] Waited for 197.491787ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:42.510220    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:42.510230    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.510241    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.510249    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.512958    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.513508    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.513521    2954 pod_ready.go:81] duration metric: took 399.632057ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.513531    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.710421    2954 request.go:629] Waited for 196.851281ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:56:42.710510    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 09:56:42.710517    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.710523    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.710527    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.713018    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.910158    2954 request.go:629] Waited for 196.774024ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:42.910295    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:42.910307    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:42.910319    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:42.910327    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:42.913021    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:42.913406    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:42.913416    2954 pod_ready.go:81] duration metric: took 399.880068ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:42.913423    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.110445    2954 request.go:629] Waited for 196.965043ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 09:56:43.110548    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 09:56:43.110603    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.110615    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.110630    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.113588    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.311083    2954 request.go:629] Waited for 196.925492ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:43.311134    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:43.311139    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.311146    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.311149    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.313184    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.313462    2954 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:43.313472    2954 pod_ready.go:81] duration metric: took 400.04465ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.313479    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.510584    2954 request.go:629] Waited for 197.060501ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:56:43.510710    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 09:56:43.510722    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.510731    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.510737    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.513575    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.710025    2954 request.go:629] Waited for 195.991998ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:43.710104    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:43.710111    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.710117    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.710121    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.712314    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:43.712653    2954 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:43.712663    2954 pod_ready.go:81] duration metric: took 399.178979ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.712670    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:43.910041    2954 request.go:629] Waited for 197.319656ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 09:56:43.910085    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 09:56:43.910092    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:43.910100    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:43.910108    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:43.913033    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.110409    2954 request.go:629] Waited for 196.775647ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:44.110512    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:44.110520    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.110526    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.110530    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.112726    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.113050    2954 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.113060    2954 pod_ready.go:81] duration metric: took 400.385455ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.113067    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.310143    2954 request.go:629] Waited for 197.043092ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:56:44.310236    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 09:56:44.310243    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.310253    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.310258    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.312471    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.510561    2954 request.go:629] Waited for 197.642859ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.510715    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.510728    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.510742    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.510750    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.513815    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:44.514349    2954 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.514363    2954 pod_ready.go:81] duration metric: took 401.290361ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.514372    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.711407    2954 request.go:629] Waited for 196.995177ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:56:44.711475    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 09:56:44.711482    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.711488    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.711491    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.713573    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.910056    2954 request.go:629] Waited for 196.042855ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.910095    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 09:56:44.910103    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:44.910112    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:44.910117    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:44.912608    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:44.912924    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:44.912934    2954 pod_ready.go:81] duration metric: took 398.555138ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:44.912941    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.112001    2954 request.go:629] Waited for 199.012783ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:56:45.112114    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 09:56:45.112125    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.112136    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.112142    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.115328    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:45.310138    2954 request.go:629] Waited for 194.249421ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:45.310197    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 09:56:45.310207    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.310217    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.310226    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.315131    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:45.315432    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:45.315442    2954 pod_ready.go:81] duration metric: took 402.495485ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.315449    2954 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.510510    2954 request.go:629] Waited for 195.017136ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 09:56:45.510595    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 09:56:45.510601    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.510607    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.510614    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.512663    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:45.709970    2954 request.go:629] Waited for 196.900157ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:45.710056    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 09:56:45.710063    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.710069    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.710073    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.712279    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:45.712540    2954 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 09:56:45.712550    2954 pod_ready.go:81] duration metric: took 397.095893ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 09:56:45.712557    2954 pod_ready.go:38] duration metric: took 5.199358243s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 09:56:45.712568    2954 api_server.go:52] waiting for apiserver process to appear ...
	I0731 09:56:45.712620    2954 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 09:56:45.724210    2954 api_server.go:72] duration metric: took 24.575970869s to wait for apiserver process to appear ...
	I0731 09:56:45.724224    2954 api_server.go:88] waiting for apiserver healthz status ...
	I0731 09:56:45.724236    2954 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 09:56:45.729801    2954 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 09:56:45.729848    2954 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 09:56:45.729855    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.729862    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.729867    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.731097    2954 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 09:56:45.731132    2954 api_server.go:141] control plane version: v1.30.3
	I0731 09:56:45.731141    2954 api_server.go:131] duration metric: took 6.912618ms to wait for apiserver health ...
	I0731 09:56:45.731147    2954 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 09:56:45.910423    2954 request.go:629] Waited for 179.236536ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:45.910520    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:45.910529    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:45.910537    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:45.910541    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:45.914926    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:45.919715    2954 system_pods.go:59] 24 kube-system pods found
	I0731 09:56:45.919728    2954 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:56:45.919732    2954 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:56:45.919735    2954 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:56:45.919738    2954 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:56:45.919742    2954 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 09:56:45.919745    2954 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:56:45.919748    2954 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:56:45.919750    2954 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 09:56:45.919753    2954 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:56:45.919756    2954 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:56:45.919759    2954 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 09:56:45.919761    2954 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:56:45.919764    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:56:45.919767    2954 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 09:56:45.919770    2954 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:56:45.919773    2954 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 09:56:45.919776    2954 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:56:45.919778    2954 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:56:45.919780    2954 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:56:45.919783    2954 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 09:56:45.919785    2954 system_pods.go:61] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:56:45.919789    2954 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:56:45.919792    2954 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 09:56:45.919795    2954 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:56:45.919799    2954 system_pods.go:74] duration metric: took 188.647794ms to wait for pod list to return data ...
	I0731 09:56:45.919808    2954 default_sa.go:34] waiting for default service account to be created ...
	I0731 09:56:46.110503    2954 request.go:629] Waited for 190.648848ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:56:46.110629    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 09:56:46.110641    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.110653    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.110659    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.113864    2954 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 09:56:46.113948    2954 default_sa.go:45] found service account: "default"
	I0731 09:56:46.113959    2954 default_sa.go:55] duration metric: took 194.145984ms for default service account to be created ...
	I0731 09:56:46.113966    2954 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 09:56:46.310339    2954 request.go:629] Waited for 196.331355ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:46.310381    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 09:56:46.310387    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.310420    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.310424    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.314581    2954 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 09:56:46.318894    2954 system_pods.go:86] 24 kube-system pods found
	I0731 09:56:46.318910    2954 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 09:56:46.318914    2954 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 09:56:46.318918    2954 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 09:56:46.318921    2954 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 09:56:46.318926    2954 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 09:56:46.318931    2954 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 09:56:46.318934    2954 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 09:56:46.318939    2954 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 09:56:46.318942    2954 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 09:56:46.318946    2954 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 09:56:46.318950    2954 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 09:56:46.318955    2954 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 09:56:46.318958    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 09:56:46.318963    2954 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 09:56:46.318966    2954 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 09:56:46.318970    2954 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 09:56:46.318973    2954 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 09:56:46.318976    2954 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 09:56:46.318980    2954 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 09:56:46.318983    2954 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 09:56:46.318987    2954 system_pods.go:89] "kube-vip-ha-393000" [14e8ed26-a49f-4cfc-9f86-11b800931cc6] Running
	I0731 09:56:46.318990    2954 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 09:56:46.318993    2954 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 09:56:46.318996    2954 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 09:56:46.319002    2954 system_pods.go:126] duration metric: took 205.029246ms to wait for k8s-apps to be running ...
	I0731 09:56:46.319007    2954 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 09:56:46.319063    2954 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 09:56:46.330197    2954 system_svc.go:56] duration metric: took 11.183343ms WaitForService to wait for kubelet
	I0731 09:56:46.330213    2954 kubeadm.go:582] duration metric: took 25.181975511s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 09:56:46.330225    2954 node_conditions.go:102] verifying NodePressure condition ...
	I0731 09:56:46.509976    2954 request.go:629] Waited for 179.711714ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 09:56:46.510033    2954 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 09:56:46.510039    2954 round_trippers.go:469] Request Headers:
	I0731 09:56:46.510045    2954 round_trippers.go:473]     Accept: application/json, */*
	I0731 09:56:46.510049    2954 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 09:56:46.512677    2954 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 09:56:46.513343    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513352    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513358    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513361    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513364    2954 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 09:56:46.513367    2954 node_conditions.go:123] node cpu capacity is 2
	I0731 09:56:46.513371    2954 node_conditions.go:105] duration metric: took 183.142994ms to run NodePressure ...
	I0731 09:56:46.513378    2954 start.go:241] waiting for startup goroutines ...
	I0731 09:56:46.513392    2954 start.go:255] writing updated cluster config ...
	I0731 09:56:46.513784    2954 ssh_runner.go:195] Run: rm -f paused
	I0731 09:56:46.555311    2954 start.go:600] kubectl: 1.29.2, cluster: 1.30.3 (minor skew: 1)
	I0731 09:56:46.577040    2954 out.go:177] * Done! kubectl is now configured to use "ha-393000" cluster and "default" namespace by default
	
	
	==> Docker <==
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/25b3d6db405f49d365d6f33539e94ee4547921a7d0c463b94585056341530cda/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/c2a288a20831d0407ed1a2c3eeeb19a9758ef98813b916541258c8c58bcce38c/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:54:25Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/480020f5f9c0ce2e553e007beff5dfbe53b17bd2beaa73039be50701f04b9e76/resolv.conf as [nameserver 192.169.0.1]"
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428712215Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428950502Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.428960130Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.429078581Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477484798Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477564679Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477577219Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.477869035Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507078466Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507147792Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507166914Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:54:25 ha-393000 dockerd[1282]: time="2024-07-31T16:54:25.507244276Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853207982Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853706000Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.853772518Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 dockerd[1282]: time="2024-07-31T16:56:47.854059851Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:47 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:56:47Z" level=info msg="Will attempt to re-write config file /var/lib/docker/containers/e9ce137a2245c1333d3f3961469d32237e88656784f689211ed86cae2fd5518f/resolv.conf as [nameserver 10.96.0.10 search default.svc.cluster.local svc.cluster.local cluster.local options ndots:5]"
	Jul 31 16:56:49 ha-393000 cri-dockerd[1170]: time="2024-07-31T16:56:49Z" level=info msg="Stop pulling image gcr.io/k8s-minikube/busybox:1.28: Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:1.28"
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157487366Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157549945Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.157563641Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 16:56:49 ha-393000 dockerd[1282]: time="2024-07-31T16:56:49.158058722Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   3 minutes ago       Running             busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	6d966e37d3618       6e38f40d628db                                                                                         5 minutes ago       Running             storage-provisioner       0                   25b3d6db405f4       storage-provisioner
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              5 minutes ago       Running             kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         6 minutes ago       Running             kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	e68314e525ef8       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     6 minutes ago       Running             kube-vip                  0                   c9f21d49b1384       kube-vip-ha-393000
	ab4f453cbe097       1f6d574d502f3                                                                                         6 minutes ago       Running             kube-apiserver            0                   7dc7f319faa98       kube-apiserver-ha-393000
	63e56744c84ee       3861cfcd7c04c                                                                                         6 minutes ago       Running             etcd                      0                   f8f20b1290499       etcd-ha-393000
	e19f7878939c9       76932a3b37d7e                                                                                         6 minutes ago       Running             kube-controller-manager   0                   67c995d2d2a3b       kube-controller-manager-ha-393000
	65412448c586b       3edc18e7b7672                                                                                         6 minutes ago       Running             kube-scheduler            0                   7ab9affa89eca       kube-scheduler-ha-393000
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:34336 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000091143s
	[INFO] 10.244.2.2:60404 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000085158s
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	
	
	==> coredns [feda36fb8a03] <==
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:43418 - 53237 "HINFO IN 5926041632293031093.721085148118182160. udp 56 false 512" NXDOMAIN qr,rd,ra 131 0.013101738s
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	
	
	==> describe nodes <==
	Name:               ha-393000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:53:48 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:00:00 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:56:56 +0000   Wed, 31 Jul 2024 16:54:24 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-393000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 baf02d554c20474b9fadb280fa1b8544
	  System UUID:                2cfe48dd-0000-0000-9b98-537ad9823a95
	  Boot ID:                    d6aa7e74-2f58-4a9d-a5df-37153dda8239
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-b94zr              0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m20s
	  kube-system                 coredns-7db6d8ff4d-5m8st             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     6m2s
	  kube-system                 coredns-7db6d8ff4d-wvqjl             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     6m2s
	  kube-system                 etcd-ha-393000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         6m16s
	  kube-system                 kindnet-hjm7c                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      6m2s
	  kube-system                 kube-apiserver-ha-393000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         6m16s
	  kube-system                 kube-controller-manager-ha-393000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         6m17s
	  kube-system                 kube-proxy-zc52f                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m2s
	  kube-system                 kube-scheduler-ha-393000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         6m16s
	  kube-system                 kube-vip-ha-393000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m19s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m1s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age    From             Message
	  ----    ------                   ----   ----             -------
	  Normal  Starting                 6m1s   kube-proxy       
	  Normal  Starting                 6m16s  kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  6m16s  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  6m16s  kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    6m16s  kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     6m16s  kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           6m3s   node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  NodeReady                5m43s  kubelet          Node ha-393000 status is now: NodeReady
	  Normal  RegisteredNode           4m44s  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           3m32s  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           60s    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	
	
	Name:               ha-393000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:55:06 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:00:01 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:58:50 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:58:50 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:58:50 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:58:50 +0000   Wed, 31 Jul 2024 16:55:27 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-393000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 8cad1e8438b9433a8e44de3ee4ad2c7a
	  System UUID:                7863443c-0000-0000-8e8d-bbd47bc06547
	  Boot ID:                    febe9487-cc37-4f76-a943-4c3bd5898a28
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-zln22                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m20s
	  kube-system                 etcd-ha-393000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         4m59s
	  kube-system                 kindnet-lcwbs                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      5m1s
	  kube-system                 kube-apiserver-ha-393000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         4m59s
	  kube-system                 kube-controller-manager-ha-393000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         4m59s
	  kube-system                 kube-proxy-cf577                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m1s
	  kube-system                 kube-scheduler-ha-393000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         4m59s
	  kube-system                 kube-vip-ha-393000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m57s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                  From             Message
	  ----     ------                   ----                 ----             -------
	  Normal   Starting                 4m57s                kube-proxy       
	  Normal   Starting                 74s                  kube-proxy       
	  Normal   NodeHasSufficientMemory  5m1s (x8 over 5m1s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    5m1s (x8 over 5m1s)  kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     5m1s (x7 over 5m1s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  5m1s                 kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           4m58s                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           4m44s                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           3m32s                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   Starting                 77s                  kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  77s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  77s                  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    77s                  kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     77s                  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 77s                  kubelet          Node ha-393000-m02 has been rebooted, boot id: febe9487-cc37-4f76-a943-4c3bd5898a28
	  Normal   RegisteredNode           60s                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	
	
	Name:               ha-393000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:56:18 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:00:02 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 16:57:19 +0000   Wed, 31 Jul 2024 16:56:40 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-393000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 86f4bf9242d1461e9aec7b900dfd2277
	  System UUID:                451d42a6-0000-0000-8ccb-b8851dda0594
	  Boot ID:                    07f25a3c-b688-461e-9d49-0a60051d0c3c
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-n8d7h                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m20s
	  kube-system                 etcd-ha-393000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         3m47s
	  kube-system                 kindnet-s2pv6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      3m49s
	  kube-system                 kube-apiserver-ha-393000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         3m48s
	  kube-system                 kube-controller-manager-ha-393000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         3m47s
	  kube-system                 kube-proxy-cr9pg                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m49s
	  kube-system                 kube-scheduler-ha-393000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         3m48s
	  kube-system                 kube-vip-ha-393000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m45s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 3m45s                  kube-proxy       
	  Normal  NodeHasSufficientMemory  3m49s (x8 over 3m49s)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    3m49s (x8 over 3m49s)  kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     3m49s (x7 over 3m49s)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  3m49s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           3m48s                  node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal  RegisteredNode           3m44s                  node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal  RegisteredNode           3m32s                  node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal  RegisteredNode           60s                    node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	
	
	==> dmesg <==
	[  +2.764750] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.236579] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000003] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.776173] systemd-fstab-generator[496]: Ignoring "noauto" option for root device
	[  +0.099418] systemd-fstab-generator[508]: Ignoring "noauto" option for root device
	[  +1.822617] systemd-fstab-generator[843]: Ignoring "noauto" option for root device
	[  +0.280031] systemd-fstab-generator[881]: Ignoring "noauto" option for root device
	[  +0.062769] kauditd_printk_skb: 95 callbacks suppressed
	[  +0.051458] systemd-fstab-generator[893]: Ignoring "noauto" option for root device
	[  +0.120058] systemd-fstab-generator[907]: Ignoring "noauto" option for root device
	[  +2.468123] systemd-fstab-generator[1123]: Ignoring "noauto" option for root device
	[  +0.099873] systemd-fstab-generator[1135]: Ignoring "noauto" option for root device
	[  +0.092257] systemd-fstab-generator[1147]: Ignoring "noauto" option for root device
	[  +0.106918] systemd-fstab-generator[1162]: Ignoring "noauto" option for root device
	[  +3.770701] systemd-fstab-generator[1268]: Ignoring "noauto" option for root device
	[  +0.056009] kauditd_printk_skb: 180 callbacks suppressed
	[  +2.552095] systemd-fstab-generator[1523]: Ignoring "noauto" option for root device
	[  +4.084188] systemd-fstab-generator[1702]: Ignoring "noauto" option for root device
	[  +0.054525] kauditd_printk_skb: 70 callbacks suppressed
	[  +7.033653] systemd-fstab-generator[2202]: Ignoring "noauto" option for root device
	[  +0.072815] kauditd_printk_skb: 72 callbacks suppressed
	[Jul31 16:54] kauditd_printk_skb: 12 callbacks suppressed
	[ +19.132251] kauditd_printk_skb: 38 callbacks suppressed
	[Jul31 16:55] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [63e56744c84e] <==
	{"level":"warn","ts":"2024-07-31T16:58:42.125472Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"1c40d7bfcdf14e3b","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T16:58:46.127349Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.6:2380/version","remote-member-id":"1c40d7bfcdf14e3b","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T16:58:46.1274Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"1c40d7bfcdf14e3b","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T16:58:46.546822Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"781.802µs","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T16:58:46.548025Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"6.213273ms","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T16:58:50.128777Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.6:2380/version","remote-member-id":"1c40d7bfcdf14e3b","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T16:58:50.128941Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"1c40d7bfcdf14e3b","error":"Get \"https://192.169.0.6:2380/version\": dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T16:58:51.547336Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"781.802µs","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T16:58:51.548436Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"6.213273ms","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"info","ts":"2024-07-31T16:58:52.097121Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"1c40d7bfcdf14e3b"}
	{"level":"warn","ts":"2024-07-31T16:58:52.192069Z","caller":"rafthttp/peer_status.go:66","msg":"peer became inactive (message send to peer failed)","peer-id":"1c40d7bfcdf14e3b","error":"failed to dial 1c40d7bfcdf14e3b on stream Message (peer 1c40d7bfcdf14e3b failed to find local node b8c6c7563d17d844)"}
	{"level":"info","ts":"2024-07-31T16:58:52.195125Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"1c40d7bfcdf14e3b"}
	{"level":"warn","ts":"2024-07-31T16:58:52.195204Z","caller":"rafthttp/peer_status.go:66","msg":"peer became inactive (message send to peer failed)","peer-id":"1c40d7bfcdf14e3b","error":"failed to dial 1c40d7bfcdf14e3b on stream MsgApp v2 (peer 1c40d7bfcdf14e3b failed to find local node b8c6c7563d17d844)"}
	{"level":"info","ts":"2024-07-31T16:58:52.195338Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T16:58:52.24112Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"1c40d7bfcdf14e3b","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-07-31T16:58:52.241219Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T16:58:52.320086Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"1c40d7bfcdf14e3b","stream-type":"stream Message"}
	{"level":"info","ts":"2024-07-31T16:58:52.32031Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T16:58:52.320749Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T16:58:52.348646Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"warn","ts":"2024-07-31T16:58:52.353081Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"192.169.0.6:53416","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-07-31T16:58:52.359714Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"192.169.0.6:53440","server-name":"","error":"read tcp 192.169.0.5:2380->192.169.0.6:53440: read: connection reset by peer"}
	{"level":"warn","ts":"2024-07-31T16:58:52.359801Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"192.169.0.6:53436","server-name":"","error":"read tcp 192.169.0.5:2380->192.169.0.6:53436: read: connection reset by peer"}
	{"level":"warn","ts":"2024-07-31T16:58:52.383169Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"192.169.0.6:53452","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-07-31T16:58:52.383955Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"192.169.0.6:53464","server-name":"","error":"EOF"}
	
	
	==> kernel <==
	 17:00:08 up 6 min,  0 users,  load average: 0.29, 0.40, 0.20
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:59:20.117329       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:59:30.110067       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:59:30.110281       1 main.go:299] handling current node
	I0731 16:59:30.110481       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:59:30.110625       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:59:30.110918       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:59:30.111059       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:59:40.110502       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:59:40.110540       1 main.go:299] handling current node
	I0731 16:59:40.110552       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:59:40.110557       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:59:40.110670       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:59:40.110698       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:59:50.118349       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:59:50.118427       1 main.go:299] handling current node
	I0731 16:59:50.118450       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:59:50.118464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:59:50.118651       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:59:50.118739       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.118883       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:00.118987       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:00.119126       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:00.119236       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.119356       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:00.119483       1 main.go:299] handling current node
	
	
	==> kube-apiserver [ab4f453cbe09] <==
	I0731 16:53:49.787246       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0731 16:53:49.838971       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0731 16:53:49.842649       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.169.0.5]
	I0731 16:53:49.843317       1 controller.go:615] quota admission added evaluator for: endpoints
	I0731 16:53:49.845885       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0731 16:53:50.451090       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0731 16:53:51.578858       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0731 16:53:51.587918       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0731 16:53:51.594571       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0731 16:54:05.505988       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	I0731 16:54:05.655031       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	E0731 16:56:52.014947       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51195: use of closed network connection
	E0731 16:56:52.206354       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51197: use of closed network connection
	E0731 16:56:52.403109       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51199: use of closed network connection
	E0731 16:56:52.600256       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51201: use of closed network connection
	E0731 16:56:52.785054       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51203: use of closed network connection
	E0731 16:56:53.004706       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51205: use of closed network connection
	E0731 16:56:53.208399       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51207: use of closed network connection
	E0731 16:56:53.392187       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51209: use of closed network connection
	E0731 16:56:53.714246       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51212: use of closed network connection
	E0731 16:56:53.895301       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51214: use of closed network connection
	E0731 16:56:54.078794       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51216: use of closed network connection
	E0731 16:56:54.262767       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51218: use of closed network connection
	E0731 16:56:54.448344       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51220: use of closed network connection
	E0731 16:56:54.629926       1 conn.go:339] Error on socket receive: read tcp 192.169.0.254:8443->192.169.0.1:51222: use of closed network connection
	
	
	==> kube-controller-manager [e19f7878939c] <==
	I0731 16:55:09.814349       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-393000-m02"
	E0731 16:56:18.277948       1 certificate_controller.go:146] Sync csr-v42tm failed with : error updating signature for csr: Operation cannot be fulfilled on certificatesigningrequests.certificates.k8s.io "csr-v42tm": the object has been modified; please apply your changes to the latest version and try again
	I0731 16:56:18.384134       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-393000-m03\" does not exist"
	I0731 16:56:18.398095       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-393000-m03" podCIDRs=["10.244.2.0/24"]
	I0731 16:56:19.822872       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-393000-m03"
	I0731 16:56:47.522324       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="152.941157ms"
	I0731 16:56:47.574976       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="52.539469ms"
	I0731 16:56:47.678922       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="103.895055ms"
	I0731 16:56:47.701560       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="22.534098ms"
	I0731 16:56:47.701787       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="74.391µs"
	I0731 16:56:47.718186       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="8.079697ms"
	I0731 16:56:47.718269       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="39.867µs"
	I0731 16:56:47.744772       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="8.73015ms"
	I0731 16:56:47.745065       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="34.302µs"
	I0731 16:56:48.288860       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="38.605µs"
	I0731 16:56:49.532986       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.769402ms"
	I0731 16:56:49.533229       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="37.061µs"
	I0731 16:56:49.677499       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.411426ms"
	I0731 16:56:49.677560       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="21.894µs"
	I0731 16:56:51.343350       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="15.340858ms"
	I0731 16:56:51.343434       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="38.532µs"
	I0731 16:58:51.645474       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="12.458022ms"
	I0731 16:58:51.645543       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="24.514µs"
	I0731 16:58:54.292579       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="38.328144ms"
	I0731 16:58:54.292702       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.709µs"
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [65412448c586] <==
	W0731 16:53:48.491080       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0731 16:53:48.491132       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0731 16:53:48.491335       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:48.491387       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0731 16:53:48.491507       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0731 16:53:48.491594       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0731 16:53:48.491662       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:48.491738       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:48.491818       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:48.491860       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:48.491537       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:48.491873       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.319781       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0731 16:53:49.319838       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0731 16:53:49.326442       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.326478       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.392116       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:49.392172       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:49.496014       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.496036       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.541411       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:49.541927       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:49.588695       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:49.588735       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	I0731 16:53:49.982415       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 31 16:55:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:55:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.510532    2209 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-wvqjl" podStartSLOduration=162.510247367 podStartE2EDuration="2m42.510247367s" podCreationTimestamp="2024-07-31 16:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-31 16:54:25.761498183 +0000 UTC m=+33.433614594" watchObservedRunningTime="2024-07-31 16:56:47.510247367 +0000 UTC m=+175.182363776"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.510944    2209 topology_manager.go:215] "Topology Admit Handler" podUID="dd382c29-63af-44cb-bf5b-b7db27f11017" podNamespace="default" podName="busybox-fc5497c4f-b94zr"
	Jul 31 16:56:47 ha-393000 kubelet[2209]: I0731 16:56:47.640155    2209 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp8k4\" (UniqueName: \"kubernetes.io/projected/dd382c29-63af-44cb-bf5b-b7db27f11017-kube-api-access-cp8k4\") pod \"busybox-fc5497c4f-b94zr\" (UID: \"dd382c29-63af-44cb-bf5b-b7db27f11017\") " pod="default/busybox-fc5497c4f-b94zr"
	Jul 31 16:56:52 ha-393000 kubelet[2209]: E0731 16:56:52.472632    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:56:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:56:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:57:52 ha-393000 kubelet[2209]: E0731 16:57:52.468077    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:57:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:57:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:58:52 ha-393000 kubelet[2209]: E0731 16:58:52.470515    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:58:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:58:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:58:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:58:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 16:59:52 ha-393000 kubelet[2209]: E0731 16:59:52.469154    2209 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 16:59:52 ha-393000 kubelet[2209]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 16:59:52 ha-393000 kubelet[2209]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 16:59:52 ha-393000 kubelet[2209]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 16:59:52 ha-393000 kubelet[2209]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:261: (dbg) Run:  kubectl --context ha-393000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:285: <<< TestMultiControlPlane/serial/RestartSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/RestartSecondaryNode (99.36s)

                                                
                                    
TestMultiControlPlane/serial/RestartClusterKeepsNodes (126.35s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/RestartClusterKeepsNodes
ha_test.go:456: (dbg) Run:  out/minikube-darwin-amd64 node list -p ha-393000 -v=7 --alsologtostderr
ha_test.go:462: (dbg) Run:  out/minikube-darwin-amd64 stop -p ha-393000 -v=7 --alsologtostderr
ha_test.go:462: (dbg) Done: out/minikube-darwin-amd64 stop -p ha-393000 -v=7 --alsologtostderr: (27.091090198s)
ha_test.go:467: (dbg) Run:  out/minikube-darwin-amd64 start -p ha-393000 --wait=true -v=7 --alsologtostderr
E0731 10:01:54.504509    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
ha_test.go:467: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p ha-393000 --wait=true -v=7 --alsologtostderr: exit status 90 (1m36.651913566s)

                                                
                                                
-- stdout --
	* [ha-393000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	* Restarting existing hyperkit VM for "ha-393000" ...
	* Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	* Enabled addons: 
	
	* Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	* Restarting existing hyperkit VM for "ha-393000-m02" ...
	* Found network options:
	  - NO_PROXY=192.169.0.5
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 10:00:36.495656    3673 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:00:36.495846    3673 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:00:36.495851    3673 out.go:304] Setting ErrFile to fd 2...
	I0731 10:00:36.495855    3673 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:00:36.496034    3673 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:00:36.497502    3673 out.go:298] Setting JSON to false
	I0731 10:00:36.520791    3673 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1806,"bootTime":1722443430,"procs":453,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:00:36.520876    3673 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:00:36.543180    3673 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 10:00:36.586807    3673 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:00:36.586859    3673 notify.go:220] Checking for updates...
	I0731 10:00:36.634547    3673 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:36.676763    3673 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:00:36.720444    3673 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:00:36.764479    3673 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:00:36.807390    3673 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:00:36.829325    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:36.829489    3673 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:00:36.830157    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.830242    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:36.839843    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51805
	I0731 10:00:36.840174    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:36.840633    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:36.840652    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:36.840857    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:36.840969    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:36.869457    3673 out.go:177] * Using the hyperkit driver based on existing profile
	I0731 10:00:36.911576    3673 start.go:297] selected driver: hyperkit
	I0731 10:00:36.911605    3673 start.go:901] validating driver "hyperkit" against &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:36.911854    3673 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:00:36.912050    3673 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:00:36.912259    3673 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:00:36.921863    3673 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:00:36.925765    3673 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.925786    3673 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:00:36.929097    3673 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:00:36.929172    3673 cni.go:84] Creating CNI manager for ""
	I0731 10:00:36.929182    3673 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:00:36.929256    3673 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:36.929358    3673 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:00:36.971579    3673 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 10:00:36.992648    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:36.992745    3673 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:00:36.992770    3673 cache.go:56] Caching tarball of preloaded images
	I0731 10:00:36.992959    3673 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:00:36.992977    3673 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:00:36.993162    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:36.993985    3673 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:00:36.994105    3673 start.go:364] duration metric: took 96.51µs to acquireMachinesLock for "ha-393000"
	I0731 10:00:36.994138    3673 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:00:36.994154    3673 fix.go:54] fixHost starting: 
	I0731 10:00:36.994597    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.994672    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:37.003590    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51807
	I0731 10:00:37.003945    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:37.004312    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:37.004326    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:37.004582    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:37.004711    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:37.004817    3673 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:00:37.004901    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.004979    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 10:00:37.005943    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 2965 missing from process table
	I0731 10:00:37.005965    3673 fix.go:112] recreateIfNeeded on ha-393000: state=Stopped err=<nil>
	I0731 10:00:37.005979    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	W0731 10:00:37.006061    3673 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:00:37.048650    3673 out.go:177] * Restarting existing hyperkit VM for "ha-393000" ...
	I0731 10:00:37.069570    3673 main.go:141] libmachine: (ha-393000) Calling .Start
	I0731 10:00:37.069827    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.069877    3673 main.go:141] libmachine: (ha-393000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 10:00:37.071916    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 2965 missing from process table
	I0731 10:00:37.071929    3673 main.go:141] libmachine: (ha-393000) DBG | pid 2965 is in state "Stopped"
	I0731 10:00:37.071946    3673 main.go:141] libmachine: (ha-393000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid...
	I0731 10:00:37.072316    3673 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 10:00:37.238669    3673 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 10:00:37.238692    3673 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:00:37.238840    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c1020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:37.238867    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c1020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:37.238912    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:00:37.238957    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:00:37.238973    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:00:37.240553    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Pid is 3685
	I0731 10:00:37.240991    3673 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 10:00:37.241011    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.241081    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:00:37.243087    3673 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 10:00:37.243212    3673 main.go:141] libmachine: (ha-393000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:00:37.243230    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:00:37.243264    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbebe}
	I0731 10:00:37.243290    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 10:00:37.243299    3673 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 10:00:37.243315    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 10:00:37.243326    3673 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 10:00:37.243339    3673 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 10:00:37.243975    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:37.244206    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:37.244644    3673 machine.go:94] provisionDockerMachine start ...
	I0731 10:00:37.244655    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:37.244765    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:37.244869    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:37.244966    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:37.245080    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:37.245168    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:37.245280    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:37.245490    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:37.245498    3673 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:00:37.248660    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:00:37.300309    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:00:37.301000    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:37.301012    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:37.301019    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:37.301029    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:37.684614    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:00:37.684630    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:00:37.799569    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:37.799605    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:37.799644    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:37.799683    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:37.800441    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:00:37.800452    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:00:43.367703    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:00:43.367775    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:00:43.367785    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:00:43.391726    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:00:47.223864    3673 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.5:22: connect: connection refused
	I0731 10:00:50.289047    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:00:50.289062    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.289203    3673 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 10:00:50.289213    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.289308    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.289398    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.289487    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.289585    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.289691    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.289830    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.289999    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.290007    3673 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 10:00:50.363752    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 10:00:50.363772    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.363906    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.364014    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.364093    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.364179    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.364291    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.364432    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.364443    3673 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:00:50.433878    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:00:50.433898    3673 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:00:50.433922    3673 buildroot.go:174] setting up certificates
	I0731 10:00:50.433931    3673 provision.go:84] configureAuth start
	I0731 10:00:50.433938    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.434079    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:50.434192    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.434286    3673 provision.go:143] copyHostCerts
	I0731 10:00:50.434327    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:00:50.434402    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:00:50.434411    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:00:50.434544    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:00:50.434743    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:00:50.434783    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:00:50.434794    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:00:50.434879    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:00:50.435018    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:00:50.435058    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:00:50.435063    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:00:50.435178    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:00:50.435321    3673 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 10:00:50.506730    3673 provision.go:177] copyRemoteCerts
	I0731 10:00:50.506778    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:00:50.506790    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.506910    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.507000    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.507081    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.507175    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:50.545550    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:00:50.545628    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 10:00:50.565303    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:00:50.565359    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:00:50.584957    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:00:50.585022    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0731 10:00:50.604884    3673 provision.go:87] duration metric: took 170.940154ms to configureAuth
	I0731 10:00:50.604897    3673 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:00:50.605065    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:50.605078    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:50.605206    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.605298    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.605377    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.605465    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.605532    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.605631    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.605760    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.605768    3673 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:00:50.667182    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:00:50.667193    3673 buildroot.go:70] root file system type: tmpfs
	I0731 10:00:50.667267    3673 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:00:50.667284    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.667424    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.667511    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.667602    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.667687    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.667814    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.668002    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.668046    3673 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:00:50.740959    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:00:50.740978    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.741112    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.741203    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.741293    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.741371    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.741505    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.741655    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.741668    3673 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:00:52.502025    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:00:52.502040    3673 machine.go:97] duration metric: took 15.257390709s to provisionDockerMachine
	I0731 10:00:52.502051    3673 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 10:00:52.502059    3673 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:00:52.502069    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.502248    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:00:52.502270    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.502370    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.502470    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.502555    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.502643    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.547198    3673 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:00:52.551739    3673 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:00:52.551756    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:00:52.551866    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:00:52.552049    3673 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:00:52.552056    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:00:52.552268    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:00:52.560032    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:00:52.588419    3673 start.go:296] duration metric: took 86.358285ms for postStartSetup
	I0731 10:00:52.588445    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.588627    3673 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:00:52.588639    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.588726    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.588826    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.588924    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.589019    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.626220    3673 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:00:52.626280    3673 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:00:52.679056    3673 fix.go:56] duration metric: took 15.684908032s for fixHost
	I0731 10:00:52.679077    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.679217    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.679309    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.679414    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.679512    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.679640    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:52.679796    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:52.679804    3673 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 10:00:52.743374    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445252.855355235
	
	I0731 10:00:52.743388    3673 fix.go:216] guest clock: 1722445252.855355235
	I0731 10:00:52.743395    3673 fix.go:229] Guest: 2024-07-31 10:00:52.855355235 -0700 PDT Remote: 2024-07-31 10:00:52.679067 -0700 PDT m=+16.220078068 (delta=176.288235ms)
	I0731 10:00:52.743428    3673 fix.go:200] guest clock delta is within tolerance: 176.288235ms
	I0731 10:00:52.743433    3673 start.go:83] releasing machines lock for "ha-393000", held for 15.749318892s
	I0731 10:00:52.743452    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.743591    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:52.743689    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.743983    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.744104    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.744194    3673 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:00:52.744225    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.744256    3673 ssh_runner.go:195] Run: cat /version.json
	I0731 10:00:52.744267    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.744309    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.744357    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.744392    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.744456    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.744481    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.744548    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.744567    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.744621    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.782140    3673 ssh_runner.go:195] Run: systemctl --version
	I0731 10:00:52.830138    3673 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 10:00:52.835167    3673 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:00:52.835215    3673 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:00:52.850271    3673 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:00:52.850283    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:00:52.850381    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:00:52.866468    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:00:52.875484    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:00:52.884391    3673 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:00:52.884442    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:00:52.893278    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:00:52.902262    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:00:52.911487    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:00:52.920398    3673 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:00:52.929249    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:00:52.938174    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:00:52.947153    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:00:52.956120    3673 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:00:52.964110    3673 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:00:52.972137    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:53.078103    3673 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:00:53.097675    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:00:53.097756    3673 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:00:53.112446    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:00:53.127406    3673 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:00:53.144131    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:00:53.155196    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:00:53.165334    3673 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:00:53.191499    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:00:53.201932    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:00:53.215879    3673 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:00:53.218826    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:00:53.226091    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:00:53.239772    3673 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:00:53.345187    3673 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:00:53.464204    3673 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:00:53.464287    3673 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:00:53.478304    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:53.574350    3673 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:00:55.900933    3673 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.32656477s)
	I0731 10:00:55.900999    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:00:55.912322    3673 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:00:55.925824    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:00:55.936801    3673 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:00:56.031696    3673 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:00:56.126413    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.241279    3673 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:00:56.258174    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:00:56.269382    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.365935    3673 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:00:56.430017    3673 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:00:56.430103    3673 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:00:56.435941    3673 start.go:563] Will wait 60s for crictl version
	I0731 10:00:56.435987    3673 ssh_runner.go:195] Run: which crictl
	I0731 10:00:56.439039    3673 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:00:56.464351    3673 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:00:56.464431    3673 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:00:56.482582    3673 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:00:56.520920    3673 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:00:56.520980    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:56.521425    3673 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:00:56.525996    3673 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:00:56.535672    3673 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 10:00:56.535754    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:56.535821    3673 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:00:56.552650    3673 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:00:56.552662    3673 docker.go:615] Images already preloaded, skipping extraction
	I0731 10:00:56.552737    3673 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:00:56.566645    3673 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:00:56.566662    3673 cache_images.go:84] Images are preloaded, skipping loading
	I0731 10:00:56.566671    3673 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 10:00:56.566751    3673 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:00:56.566818    3673 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 10:00:56.602960    3673 cni.go:84] Creating CNI manager for ""
	I0731 10:00:56.602973    3673 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:00:56.602987    3673 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 10:00:56.603002    3673 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 10:00:56.603093    3673 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 10:00:56.603113    3673 kube-vip.go:115] generating kube-vip config ...
	I0731 10:00:56.603160    3673 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:00:56.615328    3673 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:00:56.615398    3673 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:00:56.615448    3673 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:00:56.624876    3673 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:00:56.624925    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 10:00:56.632950    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 10:00:56.646343    3673 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:00:56.659992    3673 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 10:00:56.674079    3673 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:00:56.687863    3673 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:00:56.690766    3673 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:00:56.700932    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.802080    3673 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:00:56.817267    3673 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 10:00:56.817279    3673 certs.go:194] generating shared ca certs ...
	I0731 10:00:56.817290    3673 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.817479    3673 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:00:56.817554    3673 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:00:56.817564    3673 certs.go:256] generating profile certs ...
	I0731 10:00:56.817680    3673 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:00:56.817703    3673 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e
	I0731 10:00:56.817718    3673 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0731 10:00:56.884314    3673 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e ...
	I0731 10:00:56.884330    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e: {Name:mk4c6f4a11277f3afefbfb19687b3bc0d7252c4a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.884782    3673 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e ...
	I0731 10:00:56.884793    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e: {Name:mk5943238cbce29d53e24742be6a5a17eba24882 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.885016    3673 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 10:00:56.885227    3673 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 10:00:56.885483    3673 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:00:56.885493    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:00:56.885517    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:00:56.885538    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:00:56.885559    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:00:56.885578    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:00:56.885598    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:00:56.885616    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:00:56.885638    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:00:56.885737    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:00:56.885783    3673 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:00:56.885791    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:00:56.885821    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:00:56.885850    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:00:56.885884    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:00:56.885950    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:00:56.885985    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:00:56.886006    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:56.886025    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:00:56.886457    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:00:56.908157    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:00:56.929396    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:00:56.960758    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:00:56.990308    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:00:57.032527    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:00:57.067819    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:00:57.106993    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:00:57.128481    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:00:57.148005    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:00:57.167023    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:00:57.187176    3673 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 10:00:57.200671    3673 ssh_runner.go:195] Run: openssl version
	I0731 10:00:57.204930    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:00:57.213254    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.216716    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.216751    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.221199    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:00:57.229492    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:00:57.237806    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.241238    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.241272    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.245487    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:00:57.253849    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:00:57.262176    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.265564    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.265596    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.269893    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
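	[editor's note] The hash-and-symlink sequence above follows OpenSSL's c_rehash convention: each CA certificate becomes trusted by linking it under /etc/ssl/certs as <subject-hash>.0 (e.g. 3ec20f2e.0, b5213941.0, 51391683.0 in this log). A minimal standalone sketch; the temp directory and self-signed certificate here are illustrative, not from the test:

	```shell
	set -eu
	workdir=$(mktemp -d)
	# Generate a throwaway self-signed cert to stand in for minikubeCA.pem.
	openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
	  -subj "/CN=example-ca" \
	  -keyout "$workdir/ca.key" -out "$workdir/ca.pem" 2>/dev/null
	# The subject hash that names the symlink (8 hex chars):
	hash=$(openssl x509 -hash -noout -in "$workdir/ca.pem")
	# Same link shape minikube creates in /etc/ssl/certs:
	ln -fs "$workdir/ca.pem" "$workdir/$hash.0"
	test -L "$workdir/$hash.0" && echo "linked as $hash.0"
	rm -rf "$workdir"
	```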
	I0731 10:00:57.278252    3673 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:00:57.281742    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:00:57.286355    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:00:57.290726    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:00:57.295185    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:00:57.299486    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:00:57.303679    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
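	[editor's note] The six -checkend 86400 probes above ask whether each control-plane certificate will still be valid 24 hours from now (exit status 0 means yes), which is how minikube decides whether certs need regeneration. A standalone sketch against a throwaway certificate; the CN and paths are illustrative:

	```shell
	set -eu
	workdir=$(mktemp -d)
	openssl req -x509 -newkey rsa:2048 -nodes -days 2 \
	  -subj "/CN=probe" -keyout "$workdir/k" -out "$workdir/c.pem" 2>/dev/null
	# -checkend N exits 0 iff the cert is still valid N seconds from now.
	if openssl x509 -noout -in "$workdir/c.pem" -checkend 86400; then
	  status=valid
	else
	  status=expiring
	fi
	echo "$status"
	rm -rf "$workdir"
	```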
	I0731 10:00:57.308178    3673 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:57.308287    3673 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 10:00:57.320531    3673 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 10:00:57.328175    3673 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0731 10:00:57.328185    3673 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0731 10:00:57.328220    3673 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0731 10:00:57.336122    3673 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:00:57.336458    3673 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-393000" does not appear in /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.336540    3673 kubeconfig.go:62] /Users/jenkins/minikube-integration/19349-1046/kubeconfig needs updating (will repair): [kubeconfig missing "ha-393000" cluster setting kubeconfig missing "ha-393000" context setting]
	I0731 10:00:57.336737    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.337225    3673 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.337442    3673 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x4704660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 10:00:57.337765    3673 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 10:00:57.337950    3673 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0731 10:00:57.345178    3673 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0731 10:00:57.345192    3673 kubeadm.go:597] duration metric: took 17.003153ms to restartPrimaryControlPlane
	I0731 10:00:57.345197    3673 kubeadm.go:394] duration metric: took 37.024806ms to StartCluster
	I0731 10:00:57.345206    3673 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.345280    3673 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.345652    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.345873    3673 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:00:57.345886    3673 start.go:241] waiting for startup goroutines ...
	I0731 10:00:57.345893    3673 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 10:00:57.346021    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:57.387797    3673 out.go:177] * Enabled addons: 
	I0731 10:00:57.409811    3673 addons.go:510] duration metric: took 63.912845ms for enable addons: enabled=[]
	I0731 10:00:57.409921    3673 start.go:246] waiting for cluster config update ...
	I0731 10:00:57.409935    3673 start.go:255] writing updated cluster config ...
	I0731 10:00:57.431766    3673 out.go:177] 
	I0731 10:00:57.453235    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:57.453375    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.475712    3673 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 10:00:57.517781    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:57.517815    3673 cache.go:56] Caching tarball of preloaded images
	I0731 10:00:57.517989    3673 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:00:57.518009    3673 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:00:57.518145    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.519033    3673 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:00:57.519139    3673 start.go:364] duration metric: took 81.076µs to acquireMachinesLock for "ha-393000-m02"
	I0731 10:00:57.519165    3673 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:00:57.519174    3673 fix.go:54] fixHost starting: m02
	I0731 10:00:57.519609    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:57.519636    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:57.528542    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51833
	I0731 10:00:57.528874    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:57.529218    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:57.529229    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:57.529483    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:57.529615    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:00:57.529706    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:00:57.529814    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.529894    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 10:00:57.530866    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3205 missing from process table
	I0731 10:00:57.530885    3673 fix.go:112] recreateIfNeeded on ha-393000-m02: state=Stopped err=<nil>
	I0731 10:00:57.530896    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	W0731 10:00:57.530989    3673 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:00:57.572756    3673 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m02" ...
	I0731 10:00:57.593944    3673 main.go:141] libmachine: (ha-393000-m02) Calling .Start
	I0731 10:00:57.594326    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.594377    3673 main.go:141] libmachine: (ha-393000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 10:00:57.596218    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3205 missing from process table
	I0731 10:00:57.596230    3673 main.go:141] libmachine: (ha-393000-m02) DBG | pid 3205 is in state "Stopped"
	I0731 10:00:57.596246    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid...
	I0731 10:00:57.596604    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 10:00:57.624044    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 10:00:57.624071    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:00:57.624255    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000385aa0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:57.624298    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000385aa0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:57.624343    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:00:57.624392    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:00:57.624400    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:00:57.625747    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Pid is 3703
	I0731 10:00:57.626235    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 10:00:57.626251    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.626342    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:00:57.627854    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 10:00:57.627953    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:00:57.627977    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbf3e}
	I0731 10:00:57.628011    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:00:57.628034    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbebe}
	I0731 10:00:57.628052    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 10:00:57.628055    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 10:00:57.628122    3673 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 10:00:57.628751    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:00:57.628985    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.629386    3673 machine.go:94] provisionDockerMachine start ...
	I0731 10:00:57.629397    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:00:57.629529    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:00:57.629660    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:00:57.629776    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:00:57.629863    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:00:57.629968    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:00:57.630086    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:57.630251    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:00:57.630259    3673 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:00:57.633394    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:00:57.642233    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:00:57.643220    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:57.643237    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:57.643246    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:57.643254    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:58.024328    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:00:58.024355    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:00:58.138946    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:58.138964    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:58.138972    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:58.138978    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:58.139795    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:00:58.139805    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:01:03.704070    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:01:03.704154    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:01:03.704165    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:01:03.727851    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:01:08.691307    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:01:08.691320    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.691494    3673 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 10:01:08.691506    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.691592    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.691677    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:08.691764    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.691853    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.691954    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:08.692085    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:08.692236    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:08.692245    3673 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 10:01:08.755413    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 10:01:08.755429    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.755569    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:08.755667    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.755765    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.755854    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:08.755980    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:08.756132    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:08.756143    3673 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
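	[editor's note] The remote script above is minikube's standard 127.0.1.1 hostname repair: if the node name is absent from /etc/hosts, it either rewrites an existing 127.0.1.1 entry or appends one. The same logic can be exercised offline against a scratch file (temp file and hostname value here are illustrative; GNU sed/grep assumed):

	```shell
	set -eu
	hosts=$(mktemp)
	printf '127.0.0.1 localhost\n127.0.1.1 old-name\n' > "$hosts"
	name=ha-393000-m02
	if ! grep -q "\s$name$" "$hosts"; then
	  if grep -q '^127.0.1.1\s' "$hosts"; then
	    # Existing 127.0.1.1 line: rewrite it in place.
	    sed -i "s/^127.0.1.1\s.*/127.0.1.1 $name/" "$hosts"
	  else
	    # No 127.0.1.1 line yet: append one.
	    echo "127.0.1.1 $name" >> "$hosts"
	  fi
	fi
	result=$(grep '^127.0.1.1' "$hosts")
	echo "$result"
	rm -f "$hosts"
	```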
	I0731 10:01:08.818688    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:01:08.818702    3673 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:01:08.818713    3673 buildroot.go:174] setting up certificates
	I0731 10:01:08.818719    3673 provision.go:84] configureAuth start
	I0731 10:01:08.818725    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.818852    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:01:08.818933    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.819011    3673 provision.go:143] copyHostCerts
	I0731 10:01:08.819041    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:01:08.819091    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:01:08.819096    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:01:08.819226    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:01:08.819432    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:01:08.819463    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:01:08.819467    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:01:08.819545    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:01:08.819683    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:01:08.819712    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:01:08.819716    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:01:08.819792    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:01:08.819938    3673 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 10:01:09.050116    3673 provision.go:177] copyRemoteCerts
	I0731 10:01:09.050171    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:01:09.050188    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.050328    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.050426    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.050517    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.050597    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:09.085881    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:01:09.085963    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:01:09.105721    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:01:09.105784    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:01:09.125488    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:01:09.125555    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:01:09.145164    3673 provision.go:87] duration metric: took 326.438057ms to configureAuth
	I0731 10:01:09.145176    3673 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:01:09.145335    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:01:09.145348    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:09.145480    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.145573    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.145655    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.145735    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.145811    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.145938    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.146068    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.146076    3673 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:01:09.201832    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:01:09.201843    3673 buildroot.go:70] root file system type: tmpfs
	I0731 10:01:09.201923    3673 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:01:09.201934    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.202081    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.202179    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.202271    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.202354    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.202487    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.202618    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.202666    3673 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:01:09.267323    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:01:09.267343    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.267478    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.267567    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.267645    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.267729    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.267847    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.267982    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.267994    3673 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:01:10.914498    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:01:10.914513    3673 machine.go:97] duration metric: took 13.285120143s to provisionDockerMachine
	I0731 10:01:10.914520    3673 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 10:01:10.914527    3673 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:01:10.914537    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:10.914733    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:01:10.914747    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:10.914855    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:10.914953    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:10.915048    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:10.915144    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:10.959674    3673 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:01:10.963100    3673 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:01:10.963114    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:01:10.963203    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:01:10.963349    3673 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:01:10.963357    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:01:10.963530    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:01:10.972659    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:01:11.000647    3673 start.go:296] duration metric: took 86.118358ms for postStartSetup
	I0731 10:01:11.000670    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.000870    3673 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:01:11.000882    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.001001    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.001098    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.001173    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.001251    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:11.035137    3673 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:01:11.035197    3673 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:01:11.088856    3673 fix.go:56] duration metric: took 13.569681658s for fixHost
	I0731 10:01:11.088895    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.089041    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.089136    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.089222    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.089315    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.089453    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:11.089599    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:11.089606    3673 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 10:01:11.145954    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445271.149143027
	
	I0731 10:01:11.145966    3673 fix.go:216] guest clock: 1722445271.149143027
	I0731 10:01:11.145974    3673 fix.go:229] Guest: 2024-07-31 10:01:11.149143027 -0700 PDT Remote: 2024-07-31 10:01:11.088876 -0700 PDT m=+34.629889069 (delta=60.267027ms)
	I0731 10:01:11.145984    3673 fix.go:200] guest clock delta is within tolerance: 60.267027ms
	I0731 10:01:11.145988    3673 start.go:83] releasing machines lock for "ha-393000-m02", held for 13.626840447s
	I0731 10:01:11.146004    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.146144    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:01:11.168821    3673 out.go:177] * Found network options:
	I0731 10:01:11.189411    3673 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 10:01:11.210445    3673 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:01:11.210484    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211359    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211621    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211769    3673 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:01:11.211807    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 10:01:11.211854    3673 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:01:11.211952    3673 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:01:11.211972    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.212003    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.212195    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.212236    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.212390    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.212455    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.212607    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.212628    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:11.212741    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 10:01:11.245224    3673 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:01:11.245288    3673 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:01:11.292468    3673 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:01:11.292485    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:01:11.292564    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:01:11.308790    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:01:11.317853    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:01:11.326752    3673 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:01:11.326791    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:01:11.335723    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:01:11.344565    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:01:11.353617    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:01:11.362526    3673 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:01:11.371536    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:01:11.380589    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:01:11.389630    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:01:11.398848    3673 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:01:11.407046    3673 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:01:11.415065    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:01:11.507632    3673 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:01:11.526508    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:01:11.526575    3673 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:01:11.541590    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:01:11.552707    3673 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:01:11.574170    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:01:11.585642    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:01:11.595961    3673 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:01:11.615167    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:01:11.625493    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:01:11.640509    3673 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:01:11.643540    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:01:11.650600    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:01:11.664458    3673 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:01:11.766555    3673 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:01:11.880513    3673 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:01:11.880542    3673 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:01:11.894469    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:01:11.987172    3673 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:02:12.930966    3673 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1m0.943784713s)
	I0731 10:02:12.931036    3673 ssh_runner.go:195] Run: sudo journalctl --no-pager -u docker
	I0731 10:02:12.964792    3673 out.go:177] 
	W0731 10:02:12.986436    3673 out.go:239] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Jul 31 17:01:09 ha-393000-m02 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.540836219Z" level=info msg="Starting up"
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.541317477Z" level=info msg="containerd not running, starting managed containerd"
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.541838265Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=494
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.560371937Z" level=info msg="starting containerd" revision=8fc6bcff51318944179630522a095cc9dbf9f353 version=v1.7.20
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576586336Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576670079Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576715322Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576725763Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576901546Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576942171Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577133168Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577170137Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577183696Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577195352Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577298762Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577522478Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579179447Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579249243Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579392843Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579426223Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579535672Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579581480Z" level=info msg="metadata content store policy set" policy=shared
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581466765Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581512332Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581524910Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581534838Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581543733Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581622493Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581841090Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581949001Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581963875Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581973012Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581991066Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582002817Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582011239Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582020290Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582029399Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582037767Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582045966Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582053831Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582071064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582081124Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582091080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582100077Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582108106Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582116349Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582123631Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582131784Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582141489Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582150904Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582158314Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582166201Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582174064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582185286Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582198762Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582207204Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582214973Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582263286Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582297170Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582306849Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582315043Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582321631Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582330079Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582340119Z" level=info msg="NRI interface is disabled by configuration."
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582481302Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582557809Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582588544Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582634793Z" level=info msg="containerd successfully booted in 0.023010s"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.561555310Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.591279604Z" level=info msg="Loading containers: start."
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.773936432Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.836555927Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.880452097Z" level=info msg="Loading containers: done."
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.887082310Z" level=info msg="Docker daemon" commit=cc13f95 containerd-snapshotter=false storage-driver=overlay2 version=27.1.1
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.887241928Z" level=info msg="Daemon has completed initialization"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.912027531Z" level=info msg="API listen on /var/run/docker.sock"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.912107549Z" level=info msg="API listen on [::]:2376"
	Jul 31 17:01:10 ha-393000-m02 systemd[1]: Started Docker Application Container Engine.
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.014698698Z" level=info msg="Processing signal 'terminated'"
	Jul 31 17:01:12 ha-393000-m02 systemd[1]: Stopping Docker Application Container Engine...
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.015851363Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016218102Z" level=info msg="Daemon shutdown complete"
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016264206Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016277600Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: docker.service: Deactivated successfully.
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: Stopped Docker Application Container Engine.
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 17:01:13 ha-393000-m02 dockerd[1165]: time="2024-07-31T17:01:13.051074379Z" level=info msg="Starting up"
	Jul 31 17:02:13 ha-393000-m02 dockerd[1165]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: docker.service: Failed with result 'exit-code'.
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Jul 31 17:01:09 ha-393000-m02 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.540836219Z" level=info msg="Starting up"
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.541317477Z" level=info msg="containerd not running, starting managed containerd"
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.541838265Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=494
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.560371937Z" level=info msg="starting containerd" revision=8fc6bcff51318944179630522a095cc9dbf9f353 version=v1.7.20
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576586336Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576670079Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576715322Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576725763Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576901546Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576942171Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577133168Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577170137Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577183696Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577195352Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577298762Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577522478Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579179447Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579249243Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579392843Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579426223Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579535672Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579581480Z" level=info msg="metadata content store policy set" policy=shared
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581466765Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581512332Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581524910Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581534838Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581543733Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581622493Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581841090Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581949001Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581963875Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581973012Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581991066Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582002817Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582011239Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582020290Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582029399Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582037767Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582045966Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582053831Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582071064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582081124Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582091080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582100077Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582108106Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582116349Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582123631Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582131784Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582141489Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582150904Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582158314Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582166201Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582174064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582185286Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582198762Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582207204Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582214973Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582263286Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582297170Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582306849Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582315043Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582321631Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582330079Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582340119Z" level=info msg="NRI interface is disabled by configuration."
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582481302Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582557809Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582588544Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582634793Z" level=info msg="containerd successfully booted in 0.023010s"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.561555310Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.591279604Z" level=info msg="Loading containers: start."
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.773936432Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.836555927Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.880452097Z" level=info msg="Loading containers: done."
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.887082310Z" level=info msg="Docker daemon" commit=cc13f95 containerd-snapshotter=false storage-driver=overlay2 version=27.1.1
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.887241928Z" level=info msg="Daemon has completed initialization"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.912027531Z" level=info msg="API listen on /var/run/docker.sock"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.912107549Z" level=info msg="API listen on [::]:2376"
	Jul 31 17:01:10 ha-393000-m02 systemd[1]: Started Docker Application Container Engine.
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.014698698Z" level=info msg="Processing signal 'terminated'"
	Jul 31 17:01:12 ha-393000-m02 systemd[1]: Stopping Docker Application Container Engine...
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.015851363Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016218102Z" level=info msg="Daemon shutdown complete"
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016264206Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016277600Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: docker.service: Deactivated successfully.
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: Stopped Docker Application Container Engine.
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 17:01:13 ha-393000-m02 dockerd[1165]: time="2024-07-31T17:01:13.051074379Z" level=info msg="Starting up"
	Jul 31 17:02:13 ha-393000-m02 dockerd[1165]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: docker.service: Failed with result 'exit-code'.
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	W0731 10:02:12.986535    3673 out.go:239] * 
	W0731 10:02:12.987705    3673 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:02:13.049576    3673 out.go:177] 

                                                
                                                
** /stderr **
ha_test.go:469: failed to run minikube start. args "out/minikube-darwin-amd64 node list -p ha-393000 -v=7 --alsologtostderr" : exit status 90
ha_test.go:472: (dbg) Run:  out/minikube-darwin-amd64 node list -p ha-393000
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000: exit status 2 (148.431514ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:239: status error: exit status 2 (may be ok)
helpers_test.go:244: <<< TestMultiControlPlane/serial/RestartClusterKeepsNodes FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/RestartClusterKeepsNodes]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (2.111257703s)
helpers_test.go:252: TestMultiControlPlane/serial/RestartClusterKeepsNodes logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node stop m02 -v=7         | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:58 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node start m02 -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:59 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000 -v=7               | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-393000 -v=7                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT | 31 Jul 24 10:00 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 10:00:36
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 10:00:36.495656    3673 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:00:36.495846    3673 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:00:36.495851    3673 out.go:304] Setting ErrFile to fd 2...
	I0731 10:00:36.495855    3673 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:00:36.496034    3673 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:00:36.497502    3673 out.go:298] Setting JSON to false
	I0731 10:00:36.520791    3673 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1806,"bootTime":1722443430,"procs":453,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:00:36.520876    3673 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:00:36.543180    3673 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 10:00:36.586807    3673 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:00:36.586859    3673 notify.go:220] Checking for updates...
	I0731 10:00:36.634547    3673 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:36.676763    3673 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:00:36.720444    3673 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:00:36.764479    3673 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:00:36.807390    3673 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:00:36.829325    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:36.829489    3673 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:00:36.830157    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.830242    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:36.839843    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51805
	I0731 10:00:36.840174    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:36.840633    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:36.840652    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:36.840857    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:36.840969    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:36.869457    3673 out.go:177] * Using the hyperkit driver based on existing profile
	I0731 10:00:36.911576    3673 start.go:297] selected driver: hyperkit
	I0731 10:00:36.911605    3673 start.go:901] validating driver "hyperkit" against &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:36.911854    3673 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:00:36.912050    3673 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:00:36.912259    3673 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:00:36.921863    3673 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:00:36.925765    3673 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.925786    3673 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:00:36.929097    3673 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:00:36.929172    3673 cni.go:84] Creating CNI manager for ""
	I0731 10:00:36.929182    3673 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:00:36.929256    3673 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:36.929358    3673 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:00:36.971579    3673 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 10:00:36.992648    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:36.992745    3673 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:00:36.992770    3673 cache.go:56] Caching tarball of preloaded images
	I0731 10:00:36.992959    3673 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:00:36.992977    3673 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:00:36.993162    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:36.993985    3673 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:00:36.994105    3673 start.go:364] duration metric: took 96.51µs to acquireMachinesLock for "ha-393000"
	I0731 10:00:36.994138    3673 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:00:36.994154    3673 fix.go:54] fixHost starting: 
	I0731 10:00:36.994597    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.994672    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:37.003590    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51807
	I0731 10:00:37.003945    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:37.004312    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:37.004326    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:37.004582    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:37.004711    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:37.004817    3673 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:00:37.004901    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.004979    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 10:00:37.005943    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 2965 missing from process table
	I0731 10:00:37.005965    3673 fix.go:112] recreateIfNeeded on ha-393000: state=Stopped err=<nil>
	I0731 10:00:37.005979    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	W0731 10:00:37.006061    3673 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:00:37.048650    3673 out.go:177] * Restarting existing hyperkit VM for "ha-393000" ...
	I0731 10:00:37.069570    3673 main.go:141] libmachine: (ha-393000) Calling .Start
	I0731 10:00:37.069827    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.069877    3673 main.go:141] libmachine: (ha-393000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 10:00:37.071916    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 2965 missing from process table
	I0731 10:00:37.071929    3673 main.go:141] libmachine: (ha-393000) DBG | pid 2965 is in state "Stopped"
	I0731 10:00:37.071946    3673 main.go:141] libmachine: (ha-393000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid...
	I0731 10:00:37.072316    3673 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 10:00:37.238669    3673 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 10:00:37.238692    3673 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:00:37.238840    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c1020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:37.238867    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c1020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:37.238912    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:00:37.238957    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:00:37.238973    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:00:37.240553    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Pid is 3685
	I0731 10:00:37.240991    3673 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 10:00:37.241011    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.241081    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:00:37.243087    3673 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 10:00:37.243212    3673 main.go:141] libmachine: (ha-393000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:00:37.243230    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:00:37.243264    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbebe}
	I0731 10:00:37.243290    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 10:00:37.243299    3673 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 10:00:37.243315    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 10:00:37.243326    3673 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 10:00:37.243339    3673 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 10:00:37.243975    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:37.244206    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:37.244644    3673 machine.go:94] provisionDockerMachine start ...
	I0731 10:00:37.244655    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:37.244765    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:37.244869    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:37.244966    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:37.245080    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:37.245168    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:37.245280    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:37.245490    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:37.245498    3673 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:00:37.248660    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:00:37.300309    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:00:37.301000    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:37.301012    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:37.301019    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:37.301029    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:37.684614    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:00:37.684630    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:00:37.799569    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:37.799605    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:37.799644    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:37.799683    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:37.800441    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:00:37.800452    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:00:43.367703    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:00:43.367775    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:00:43.367785    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:00:43.391726    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:00:47.223864    3673 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.5:22: connect: connection refused
	I0731 10:00:50.289047    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:00:50.289062    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.289203    3673 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 10:00:50.289213    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.289308    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.289398    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.289487    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.289585    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.289691    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.289830    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.289999    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.290007    3673 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 10:00:50.363752    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 10:00:50.363772    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.363906    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.364014    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.364093    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.364179    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.364291    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.364432    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.364443    3673 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:00:50.433878    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:00:50.433898    3673 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:00:50.433922    3673 buildroot.go:174] setting up certificates
	I0731 10:00:50.433931    3673 provision.go:84] configureAuth start
	I0731 10:00:50.433938    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.434079    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:50.434192    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.434286    3673 provision.go:143] copyHostCerts
	I0731 10:00:50.434327    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:00:50.434402    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:00:50.434411    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:00:50.434544    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:00:50.434743    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:00:50.434783    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:00:50.434794    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:00:50.434879    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:00:50.435018    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:00:50.435058    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:00:50.435063    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:00:50.435178    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:00:50.435321    3673 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 10:00:50.506730    3673 provision.go:177] copyRemoteCerts
	I0731 10:00:50.506778    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:00:50.506790    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.506910    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.507000    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.507081    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.507175    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:50.545550    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:00:50.545628    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 10:00:50.565303    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:00:50.565359    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:00:50.584957    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:00:50.585022    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0731 10:00:50.604884    3673 provision.go:87] duration metric: took 170.940154ms to configureAuth
	I0731 10:00:50.604897    3673 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:00:50.605065    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:50.605078    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:50.605206    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.605298    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.605377    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.605465    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.605532    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.605631    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.605760    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.605768    3673 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:00:50.667182    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:00:50.667193    3673 buildroot.go:70] root file system type: tmpfs
	I0731 10:00:50.667267    3673 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:00:50.667284    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.667424    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.667511    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.667602    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.667687    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.667814    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.668002    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.668046    3673 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:00:50.740959    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:00:50.740978    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.741112    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.741203    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.741293    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.741371    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.741505    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.741655    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.741668    3673 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:00:52.502025    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:00:52.502040    3673 machine.go:97] duration metric: took 15.257390709s to provisionDockerMachine
	I0731 10:00:52.502051    3673 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 10:00:52.502059    3673 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:00:52.502069    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.502248    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:00:52.502270    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.502370    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.502470    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.502555    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.502643    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.547198    3673 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:00:52.551739    3673 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:00:52.551756    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:00:52.551866    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:00:52.552049    3673 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:00:52.552056    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:00:52.552268    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:00:52.560032    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:00:52.588419    3673 start.go:296] duration metric: took 86.358285ms for postStartSetup
	I0731 10:00:52.588445    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.588627    3673 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:00:52.588639    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.588726    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.588826    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.588924    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.589019    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.626220    3673 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:00:52.626280    3673 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:00:52.679056    3673 fix.go:56] duration metric: took 15.684908032s for fixHost
	I0731 10:00:52.679077    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.679217    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.679309    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.679414    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.679512    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.679640    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:52.679796    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:52.679804    3673 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:00:52.743374    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445252.855355235
	
	I0731 10:00:52.743388    3673 fix.go:216] guest clock: 1722445252.855355235
	I0731 10:00:52.743395    3673 fix.go:229] Guest: 2024-07-31 10:00:52.855355235 -0700 PDT Remote: 2024-07-31 10:00:52.679067 -0700 PDT m=+16.220078068 (delta=176.288235ms)
	I0731 10:00:52.743428    3673 fix.go:200] guest clock delta is within tolerance: 176.288235ms
	I0731 10:00:52.743433    3673 start.go:83] releasing machines lock for "ha-393000", held for 15.749318892s
	I0731 10:00:52.743452    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.743591    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:52.743689    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.743983    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.744104    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.744194    3673 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:00:52.744225    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.744256    3673 ssh_runner.go:195] Run: cat /version.json
	I0731 10:00:52.744267    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.744309    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.744357    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.744392    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.744456    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.744481    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.744548    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.744567    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.744621    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.782140    3673 ssh_runner.go:195] Run: systemctl --version
	I0731 10:00:52.830138    3673 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 10:00:52.835167    3673 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:00:52.835215    3673 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:00:52.850271    3673 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:00:52.850283    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:00:52.850381    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:00:52.866468    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:00:52.875484    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:00:52.884391    3673 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:00:52.884442    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:00:52.893278    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:00:52.902262    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:00:52.911487    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:00:52.920398    3673 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:00:52.929249    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:00:52.938174    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:00:52.947153    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:00:52.956120    3673 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:00:52.964110    3673 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:00:52.972137    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:53.078103    3673 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:00:53.097675    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:00:53.097756    3673 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:00:53.112446    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:00:53.127406    3673 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:00:53.144131    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:00:53.155196    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:00:53.165334    3673 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:00:53.191499    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:00:53.201932    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:00:53.215879    3673 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:00:53.218826    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:00:53.226091    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:00:53.239772    3673 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:00:53.345187    3673 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:00:53.464204    3673 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:00:53.464287    3673 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:00:53.478304    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:53.574350    3673 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:00:55.900933    3673 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.32656477s)
	I0731 10:00:55.900999    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:00:55.912322    3673 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:00:55.925824    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:00:55.936801    3673 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:00:56.031696    3673 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:00:56.126413    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.241279    3673 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:00:56.258174    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:00:56.269382    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.365935    3673 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:00:56.430017    3673 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:00:56.430103    3673 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:00:56.435941    3673 start.go:563] Will wait 60s for crictl version
	I0731 10:00:56.435987    3673 ssh_runner.go:195] Run: which crictl
	I0731 10:00:56.439039    3673 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:00:56.464351    3673 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:00:56.464431    3673 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:00:56.482582    3673 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:00:56.520920    3673 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:00:56.520980    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:56.521425    3673 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:00:56.525996    3673 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:00:56.535672    3673 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 10:00:56.535754    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:56.535821    3673 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:00:56.552650    3673 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:00:56.552662    3673 docker.go:615] Images already preloaded, skipping extraction
	I0731 10:00:56.552737    3673 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:00:56.566645    3673 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:00:56.566662    3673 cache_images.go:84] Images are preloaded, skipping loading
	I0731 10:00:56.566671    3673 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 10:00:56.566751    3673 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:00:56.566818    3673 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 10:00:56.602960    3673 cni.go:84] Creating CNI manager for ""
	I0731 10:00:56.602973    3673 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:00:56.602987    3673 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 10:00:56.603002    3673 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 10:00:56.603093    3673 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 10:00:56.603113    3673 kube-vip.go:115] generating kube-vip config ...
	I0731 10:00:56.603160    3673 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:00:56.615328    3673 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:00:56.615398    3673 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:00:56.615448    3673 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:00:56.624876    3673 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:00:56.624925    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 10:00:56.632950    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 10:00:56.646343    3673 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:00:56.659992    3673 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 10:00:56.674079    3673 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:00:56.687863    3673 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:00:56.690766    3673 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
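	(Editor's note: the one-liner above rewrites /etc/hosts idempotently — strip any stale control-plane entry, then append the current one. A self-contained sketch of that pattern, using a temp file in place of /etc/hosts and an assumed stale IP of 192.169.0.99:)

```shell
# Idempotent hosts-entry rewrite, as minikube runs it (sketch; temp file stands in
# for /etc/hosts, and 192.169.0.99 is a hypothetical stale entry).
set -e
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.169.0.99\tcontrol-plane.minikube.internal\n' > "$hosts"
# Drop any line ending in "<tab>control-plane.minikube.internal", append the new mapping.
{ grep -v $'\tcontrol-plane.minikube.internal$' "$hosts"; \
  printf '192.169.0.254\tcontrol-plane.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"
cat "$hosts"
```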
	I0731 10:00:56.700932    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.802080    3673 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:00:56.817267    3673 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 10:00:56.817279    3673 certs.go:194] generating shared ca certs ...
	I0731 10:00:56.817290    3673 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.817479    3673 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:00:56.817554    3673 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:00:56.817564    3673 certs.go:256] generating profile certs ...
	I0731 10:00:56.817680    3673 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:00:56.817703    3673 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e
	I0731 10:00:56.817718    3673 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0731 10:00:56.884314    3673 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e ...
	I0731 10:00:56.884330    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e: {Name:mk4c6f4a11277f3afefbfb19687b3bc0d7252c4a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.884782    3673 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e ...
	I0731 10:00:56.884793    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e: {Name:mk5943238cbce29d53e24742be6a5a17eba24882 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.885016    3673 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 10:00:56.885227    3673 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 10:00:56.885483    3673 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:00:56.885493    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:00:56.885517    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:00:56.885538    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:00:56.885559    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:00:56.885578    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:00:56.885598    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:00:56.885616    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:00:56.885638    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:00:56.885737    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:00:56.885783    3673 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:00:56.885791    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:00:56.885821    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:00:56.885850    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:00:56.885884    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:00:56.885950    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:00:56.885985    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:00:56.886006    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:56.886025    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:00:56.886457    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:00:56.908157    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:00:56.929396    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:00:56.960758    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:00:56.990308    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:00:57.032527    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:00:57.067819    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:00:57.106993    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:00:57.128481    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:00:57.148005    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:00:57.167023    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:00:57.187176    3673 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 10:00:57.200671    3673 ssh_runner.go:195] Run: openssl version
	I0731 10:00:57.204930    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:00:57.213254    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.216716    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.216751    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.221199    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:00:57.229492    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:00:57.237806    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.241238    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.241272    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.245487    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:00:57.253849    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:00:57.262176    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.265564    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.265596    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.269893    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
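	(Editor's note: the `openssl x509 -hash` + `ln -fs .../<hash>.0` pairs above install each CA into the system trust store using OpenSSL's subject-hash lookup convention. A self-contained sketch with a throwaway self-signed CA — `demoCA` and the temp paths are illustrative, not from the log:)

```shell
# OpenSSL resolves CAs in /etc/ssl/certs by <subject-hash>.0 symlinks; this is the
# convention the log's "ln -fs ... /etc/ssl/certs/51391683.0" step sets up (sketch).
set -e
dir=$(mktemp -d)
# Generate a throwaway self-signed CA to hash (hypothetical stand-in for minikubeCA.pem).
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" -days 1 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")
ln -fs "$dir/ca.pem" "$dir/$hash.0"
ls -l "$dir/$hash.0"
```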
	I0731 10:00:57.278252    3673 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:00:57.281742    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:00:57.286355    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:00:57.290726    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:00:57.295185    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:00:57.299486    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:00:57.303679    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
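	(Editor's note: the six `-checkend 86400` probes above each ask whether a cert will still be valid 24 hours from now; exit status 0 means yes, so minikube skips regeneration. A self-contained sketch with a throwaway cert:)

```shell
# "-checkend N" exits 0 iff the cert is still valid N seconds from now; the log
# uses 86400 (24h) as the regeneration threshold (sketch with a throwaway cert).
set -e
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout "$dir/demo.key" -out "$dir/demo.pem" -days 30 2>/dev/null
openssl x509 -noout -in "$dir/demo.pem" -checkend 86400 && echo "valid for at least 24h"
```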
	I0731 10:00:57.308178    3673 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 C
lusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false fres
hpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:do
cker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:57.308287    3673 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 10:00:57.320531    3673 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 10:00:57.328175    3673 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0731 10:00:57.328185    3673 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0731 10:00:57.328220    3673 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0731 10:00:57.336122    3673 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:00:57.336458    3673 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-393000" does not appear in /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.336540    3673 kubeconfig.go:62] /Users/jenkins/minikube-integration/19349-1046/kubeconfig needs updating (will repair): [kubeconfig missing "ha-393000" cluster setting kubeconfig missing "ha-393000" context setting]
	I0731 10:00:57.336737    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.337225    3673 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.337442    3673 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)},
UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x4704660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 10:00:57.337765    3673 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 10:00:57.337950    3673 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0731 10:00:57.345178    3673 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0731 10:00:57.345192    3673 kubeadm.go:597] duration metric: took 17.003153ms to restartPrimaryControlPlane
	I0731 10:00:57.345197    3673 kubeadm.go:394] duration metric: took 37.024806ms to StartCluster
	I0731 10:00:57.345206    3673 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.345280    3673 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.345652    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.345873    3673 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:00:57.345886    3673 start.go:241] waiting for startup goroutines ...
	I0731 10:00:57.345893    3673 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 10:00:57.346021    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:57.387797    3673 out.go:177] * Enabled addons: 
	I0731 10:00:57.409811    3673 addons.go:510] duration metric: took 63.912845ms for enable addons: enabled=[]
	I0731 10:00:57.409921    3673 start.go:246] waiting for cluster config update ...
	I0731 10:00:57.409935    3673 start.go:255] writing updated cluster config ...
	I0731 10:00:57.431766    3673 out.go:177] 
	I0731 10:00:57.453235    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:57.453375    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.475712    3673 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 10:00:57.517781    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:57.517815    3673 cache.go:56] Caching tarball of preloaded images
	I0731 10:00:57.517989    3673 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:00:57.518009    3673 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:00:57.518145    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.519033    3673 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:00:57.519139    3673 start.go:364] duration metric: took 81.076µs to acquireMachinesLock for "ha-393000-m02"
	I0731 10:00:57.519165    3673 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:00:57.519174    3673 fix.go:54] fixHost starting: m02
	I0731 10:00:57.519609    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:57.519636    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:57.528542    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51833
	I0731 10:00:57.528874    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:57.529218    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:57.529229    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:57.529483    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:57.529615    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:00:57.529706    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:00:57.529814    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.529894    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 10:00:57.530866    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3205 missing from process table
	I0731 10:00:57.530885    3673 fix.go:112] recreateIfNeeded on ha-393000-m02: state=Stopped err=<nil>
	I0731 10:00:57.530896    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	W0731 10:00:57.530989    3673 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:00:57.572756    3673 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m02" ...
	I0731 10:00:57.593944    3673 main.go:141] libmachine: (ha-393000-m02) Calling .Start
	I0731 10:00:57.594326    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.594377    3673 main.go:141] libmachine: (ha-393000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 10:00:57.596218    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3205 missing from process table
	I0731 10:00:57.596230    3673 main.go:141] libmachine: (ha-393000-m02) DBG | pid 3205 is in state "Stopped"
	I0731 10:00:57.596246    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid...
	I0731 10:00:57.596604    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 10:00:57.624044    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 10:00:57.624071    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:00:57.624255    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000385aa0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:57.624298    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000385aa0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:57.624343    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:00:57.624392    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:00:57.624400    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:00:57.625747    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Pid is 3703
	I0731 10:00:57.626235    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 10:00:57.626251    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.626342    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:00:57.627854    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 10:00:57.627953    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:00:57.627977    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbf3e}
	I0731 10:00:57.628011    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:00:57.628034    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbebe}
	I0731 10:00:57.628052    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 10:00:57.628055    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 10:00:57.628122    3673 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 10:00:57.628751    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:00:57.628985    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.629386    3673 machine.go:94] provisionDockerMachine start ...
	I0731 10:00:57.629397    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:00:57.629529    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:00:57.629660    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:00:57.629776    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:00:57.629863    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:00:57.629968    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:00:57.630086    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:57.630251    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:00:57.630259    3673 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:00:57.633394    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:00:57.642233    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:00:57.643220    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:57.643237    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:57.643246    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:57.643254    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:58.024328    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:00:58.024355    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:00:58.138946    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:58.138964    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:58.138972    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:58.138978    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:58.139795    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:00:58.139805    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:01:03.704070    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:01:03.704154    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:01:03.704165    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:01:03.727851    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:01:08.691307    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:01:08.691320    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.691494    3673 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 10:01:08.691506    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.691592    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.691677    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:08.691764    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.691853    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.691954    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:08.692085    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:08.692236    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:08.692245    3673 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 10:01:08.755413    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 10:01:08.755429    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.755569    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:08.755667    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.755765    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.755854    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:08.755980    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:08.756132    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:08.756143    3673 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:01:08.818688    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:01:08.818702    3673 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:01:08.818713    3673 buildroot.go:174] setting up certificates
	I0731 10:01:08.818719    3673 provision.go:84] configureAuth start
	I0731 10:01:08.818725    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.818852    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:01:08.818933    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.819011    3673 provision.go:143] copyHostCerts
	I0731 10:01:08.819041    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:01:08.819091    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:01:08.819096    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:01:08.819226    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:01:08.819432    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:01:08.819463    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:01:08.819467    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:01:08.819545    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:01:08.819683    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:01:08.819712    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:01:08.819716    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:01:08.819792    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:01:08.819938    3673 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 10:01:09.050116    3673 provision.go:177] copyRemoteCerts
	I0731 10:01:09.050171    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:01:09.050188    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.050328    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.050426    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.050517    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.050597    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:09.085881    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:01:09.085963    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:01:09.105721    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:01:09.105784    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:01:09.125488    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:01:09.125555    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:01:09.145164    3673 provision.go:87] duration metric: took 326.438057ms to configureAuth
	I0731 10:01:09.145176    3673 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:01:09.145335    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:01:09.145348    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:09.145480    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.145573    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.145655    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.145735    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.145811    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.145938    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.146068    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.146076    3673 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:01:09.201832    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:01:09.201843    3673 buildroot.go:70] root file system type: tmpfs
	I0731 10:01:09.201923    3673 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:01:09.201934    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.202081    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.202179    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.202271    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.202354    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.202487    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.202618    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.202666    3673 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:01:09.267323    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:01:09.267343    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.267478    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.267567    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.267645    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.267729    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.267847    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.267982    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.267994    3673 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:01:10.914498    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:01:10.914513    3673 machine.go:97] duration metric: took 13.285120143s to provisionDockerMachine
	I0731 10:01:10.914520    3673 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 10:01:10.914527    3673 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:01:10.914537    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:10.914733    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:01:10.914747    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:10.914855    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:10.914953    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:10.915048    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:10.915144    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:10.959674    3673 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:01:10.963100    3673 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:01:10.963114    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:01:10.963203    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:01:10.963349    3673 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:01:10.963357    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:01:10.963530    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:01:10.972659    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:01:11.000647    3673 start.go:296] duration metric: took 86.118358ms for postStartSetup
	I0731 10:01:11.000670    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.000870    3673 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:01:11.000882    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.001001    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.001098    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.001173    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.001251    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:11.035137    3673 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:01:11.035197    3673 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:01:11.088856    3673 fix.go:56] duration metric: took 13.569681658s for fixHost
	I0731 10:01:11.088895    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.089041    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.089136    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.089222    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.089315    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.089453    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:11.089599    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:11.089606    3673 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:01:11.145954    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445271.149143027
	
	I0731 10:01:11.145966    3673 fix.go:216] guest clock: 1722445271.149143027
	I0731 10:01:11.145974    3673 fix.go:229] Guest: 2024-07-31 10:01:11.149143027 -0700 PDT Remote: 2024-07-31 10:01:11.088876 -0700 PDT m=+34.629889069 (delta=60.267027ms)
	I0731 10:01:11.145984    3673 fix.go:200] guest clock delta is within tolerance: 60.267027ms
	I0731 10:01:11.145988    3673 start.go:83] releasing machines lock for "ha-393000-m02", held for 13.626840447s
	I0731 10:01:11.146004    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.146144    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:01:11.168821    3673 out.go:177] * Found network options:
	I0731 10:01:11.189411    3673 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 10:01:11.210445    3673 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:01:11.210484    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211359    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211621    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211769    3673 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:01:11.211807    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 10:01:11.211854    3673 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:01:11.211952    3673 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:01:11.211972    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.212003    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.212195    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.212236    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.212390    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.212455    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.212607    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.212628    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:11.212741    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 10:01:11.245224    3673 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:01:11.245288    3673 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:01:11.292468    3673 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:01:11.292485    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:01:11.292564    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:01:11.308790    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:01:11.317853    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:01:11.326752    3673 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:01:11.326791    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:01:11.335723    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:01:11.344565    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:01:11.353617    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:01:11.362526    3673 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:01:11.371536    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:01:11.380589    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:01:11.389630    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:01:11.398848    3673 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:01:11.407046    3673 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:01:11.415065    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:01:11.507632    3673 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:01:11.526508    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:01:11.526575    3673 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:01:11.541590    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:01:11.552707    3673 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:01:11.574170    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:01:11.585642    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:01:11.595961    3673 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:01:11.615167    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:01:11.625493    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:01:11.640509    3673 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:01:11.643540    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:01:11.650600    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:01:11.664458    3673 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:01:11.766555    3673 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:01:11.880513    3673 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:01:11.880542    3673 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:01:11.894469    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:01:11.987172    3673 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:02:12.930966    3673 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1m0.943784713s)
	I0731 10:02:12.931036    3673 ssh_runner.go:195] Run: sudo journalctl --no-pager -u docker
	I0731 10:02:12.964792    3673 out.go:177] 
	W0731 10:02:12.986436    3673 out.go:239] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Jul 31 17:01:09 ha-393000-m02 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.540836219Z" level=info msg="Starting up"
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.541317477Z" level=info msg="containerd not running, starting managed containerd"
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.541838265Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=494
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.560371937Z" level=info msg="starting containerd" revision=8fc6bcff51318944179630522a095cc9dbf9f353 version=v1.7.20
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576586336Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576670079Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576715322Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576725763Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576901546Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576942171Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577133168Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577170137Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577183696Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577195352Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577298762Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577522478Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579179447Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579249243Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579392843Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579426223Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579535672Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579581480Z" level=info msg="metadata content store policy set" policy=shared
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581466765Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581512332Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581524910Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581534838Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581543733Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581622493Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581841090Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581949001Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581963875Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581973012Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581991066Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582002817Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582011239Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582020290Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582029399Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582037767Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582045966Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582053831Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582071064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582081124Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582091080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582100077Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582108106Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582116349Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582123631Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582131784Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582141489Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582150904Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582158314Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582166201Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582174064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582185286Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582198762Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582207204Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582214973Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582263286Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582297170Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582306849Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582315043Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582321631Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582330079Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582340119Z" level=info msg="NRI interface is disabled by configuration."
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582481302Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582557809Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582588544Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582634793Z" level=info msg="containerd successfully booted in 0.023010s"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.561555310Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.591279604Z" level=info msg="Loading containers: start."
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.773936432Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.836555927Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.880452097Z" level=info msg="Loading containers: done."
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.887082310Z" level=info msg="Docker daemon" commit=cc13f95 containerd-snapshotter=false storage-driver=overlay2 version=27.1.1
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.887241928Z" level=info msg="Daemon has completed initialization"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.912027531Z" level=info msg="API listen on /var/run/docker.sock"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.912107549Z" level=info msg="API listen on [::]:2376"
	Jul 31 17:01:10 ha-393000-m02 systemd[1]: Started Docker Application Container Engine.
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.014698698Z" level=info msg="Processing signal 'terminated'"
	Jul 31 17:01:12 ha-393000-m02 systemd[1]: Stopping Docker Application Container Engine...
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.015851363Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016218102Z" level=info msg="Daemon shutdown complete"
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016264206Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016277600Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: docker.service: Deactivated successfully.
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: Stopped Docker Application Container Engine.
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 17:01:13 ha-393000-m02 dockerd[1165]: time="2024-07-31T17:01:13.051074379Z" level=info msg="Starting up"
	Jul 31 17:02:13 ha-393000-m02 dockerd[1165]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: docker.service: Failed with result 'exit-code'.
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	W0731 10:02:12.986535    3673 out.go:239] * 
	W0731 10:02:12.987705    3673 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:02:13.049576    3673 out.go:177] 
	
	
	==> Docker <==
	Jul 31 17:01:03 ha-393000 dockerd[1182]: time="2024-07-31T17:01:03.955592926Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:25 ha-393000 dockerd[1176]: time="2024-07-31T17:01:25.077687038Z" level=info msg="ignoring event" container=8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:01:25 ha-393000 dockerd[1182]: time="2024-07-31T17:01:25.078615114Z" level=info msg="shim disconnected" id=8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7 namespace=moby
	Jul 31 17:01:25 ha-393000 dockerd[1182]: time="2024-07-31T17:01:25.079358383Z" level=warning msg="cleaning up after shim disconnected" id=8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7 namespace=moby
	Jul 31 17:01:25 ha-393000 dockerd[1182]: time="2024-07-31T17:01:25.079401742Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:01:26 ha-393000 dockerd[1176]: time="2024-07-31T17:01:26.090101886Z" level=info msg="ignoring event" container=375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:01:26 ha-393000 dockerd[1182]: time="2024-07-31T17:01:26.090726839Z" level=info msg="shim disconnected" id=375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3 namespace=moby
	Jul 31 17:01:26 ha-393000 dockerd[1182]: time="2024-07-31T17:01:26.090779643Z" level=warning msg="cleaning up after shim disconnected" id=375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3 namespace=moby
	Jul 31 17:01:26 ha-393000 dockerd[1182]: time="2024-07-31T17:01:26.090788476Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149678222Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149721539Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149733538Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149849844Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.145857339Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.146175734Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.146320206Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.146525965Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:02:03 ha-393000 dockerd[1176]: time="2024-07-31T17:02:03.637026559Z" level=info msg="ignoring event" container=e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:02:03 ha-393000 dockerd[1182]: time="2024-07-31T17:02:03.636863270Z" level=info msg="shim disconnected" id=e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb namespace=moby
	Jul 31 17:02:03 ha-393000 dockerd[1182]: time="2024-07-31T17:02:03.637470771Z" level=warning msg="cleaning up after shim disconnected" id=e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb namespace=moby
	Jul 31 17:02:03 ha-393000 dockerd[1182]: time="2024-07-31T17:02:03.637576887Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1182]: time="2024-07-31T17:02:11.770900358Z" level=info msg="shim disconnected" id=dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222 namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1182]: time="2024-07-31T17:02:11.770966496Z" level=warning msg="cleaning up after shim disconnected" id=dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222 namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1182]: time="2024-07-31T17:02:11.770975588Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1176]: time="2024-07-31T17:02:11.771381326Z" level=info msg="ignoring event" container=dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	dd8064769032d       76932a3b37d7e                                                                                         23 seconds ago       Exited              kube-controller-manager   2                   626ea84aade06       kube-controller-manager-ha-393000
	e557dfd18a90c       1f6d574d502f3                                                                                         31 seconds ago       Exited              kube-apiserver            2                   194073f1c5ac9       kube-apiserver-ha-393000
	86018b08bbaa1       3861cfcd7c04c                                                                                         About a minute ago   Running             etcd                      1                   ba75e4f4299bf       etcd-ha-393000
	5fcb6f7d8ab78       38af8ddebf499                                                                                         About a minute ago   Running             kube-vip                  0                   e6198932cc027       kube-vip-ha-393000
	d088fefe5f8e3       3edc18e7b7672                                                                                         About a minute ago   Running             kube-scheduler            1                   f04a7ecd568d2       kube-scheduler-ha-393000
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   5 minutes ago        Exited              busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         7 minutes ago        Exited              coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         7 minutes ago        Exited              coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	6d966e37d3618       6e38f40d628db                                                                                         7 minutes ago        Exited              storage-provisioner       0                   25b3d6db405f4       storage-provisioner
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              8 minutes ago        Exited              kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         8 minutes ago        Exited              kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	e68314e525ef8       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     8 minutes ago        Exited              kube-vip                  0                   c9f21d49b1384       kube-vip-ha-393000
	63e56744c84ee       3861cfcd7c04c                                                                                         8 minutes ago        Exited              etcd                      0                   f8f20b1290499       etcd-ha-393000
	65412448c586b       3edc18e7b7672                                                                                         8 minutes ago        Exited              kube-scheduler            0                   7ab9affa89eca       kube-scheduler-ha-393000
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [feda36fb8a03] <==
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E0731 17:02:14.418704    2573 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:14.419060    2573 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:14.420604    2573 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:14.420837    2573 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:14.422579    2573 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[  +0.000000] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000001] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.035932] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.007966] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[  +5.670742] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.007034] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.684430] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.291972] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000007] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +2.476130] systemd-fstab-generator[479]: Ignoring "noauto" option for root device
	[  +0.098443] systemd-fstab-generator[491]: Ignoring "noauto" option for root device
	[  +1.350473] kauditd_printk_skb: 42 callbacks suppressed
	[  +0.720403] systemd-fstab-generator[1105]: Ignoring "noauto" option for root device
	[  +0.253450] systemd-fstab-generator[1142]: Ignoring "noauto" option for root device
	[  +0.122356] systemd-fstab-generator[1154]: Ignoring "noauto" option for root device
	[  +0.120102] systemd-fstab-generator[1168]: Ignoring "noauto" option for root device
	[  +2.456184] systemd-fstab-generator[1382]: Ignoring "noauto" option for root device
	[  +0.101772] systemd-fstab-generator[1394]: Ignoring "noauto" option for root device
	[  +0.097595] systemd-fstab-generator[1406]: Ignoring "noauto" option for root device
	[  +0.132878] systemd-fstab-generator[1422]: Ignoring "noauto" option for root device
	[  +0.441269] systemd-fstab-generator[1586]: Ignoring "noauto" option for root device
	[Jul31 17:01] kauditd_printk_skb: 271 callbacks suppressed
	[ +21.458149] kauditd_printk_skb: 40 callbacks suppressed
	
	
	==> etcd [63e56744c84e] <==
	{"level":"info","ts":"2024-07-31T17:00:28.797806Z","caller":"traceutil/trace.go:171","msg":"trace[2004834865] range","detail":"{range_begin:/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath; range_end:; }","duration":"7.769094578s","start":"2024-07-31T17:00:21.028708Z","end":"2024-07-31T17:00:28.797803Z","steps":["trace[2004834865] 'agreement among raft nodes before linearized reading'  (duration: 7.769083488s)"],"step_count":1}
	{"level":"warn","ts":"2024-07-31T17:00:28.797814Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-07-31T17:00:21.028699Z","time spent":"7.769113485s","remote":"127.0.0.1:48488","response type":"/etcdserverpb.KV/Range","request count":0,"request size":67,"response count":0,"response size":0,"request content":"key:\"/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath\" "}
	2024/07/31 17:00:28 WARNING: [core] [Server #5] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-07-31T17:00:28.797862Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-07-31T17:00:27.051463Z","time spent":"1.746397856s","remote":"127.0.0.1:48588","response type":"/etcdserverpb.KV/Txn","request count":0,"request size":0,"response count":0,"response size":0,"request content":""}
	2024/07/31 17:00:28 WARNING: [core] [Server #5] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-07-31T17:00:28.892734Z","caller":"embed/serve.go:212","msg":"stopping secure grpc server due to error","error":"accept tcp 192.169.0.5:2379: use of closed network connection"}
	{"level":"warn","ts":"2024-07-31T17:00:28.892779Z","caller":"embed/serve.go:214","msg":"stopped secure grpc server due to error","error":"accept tcp 192.169.0.5:2379: use of closed network connection"}
	{"level":"info","ts":"2024-07-31T17:00:28.892813Z","caller":"etcdserver/server.go:1462","msg":"skipped leadership transfer; local server is not leader","local-member-id":"b8c6c7563d17d844","current-leader-member-id":"0"}
	{"level":"info","ts":"2024-07-31T17:00:28.893872Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.89389Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.893909Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894003Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894029Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894052Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.89406Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894064Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894069Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894081Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894256Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.89428Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894301Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.89431Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.900461Z","caller":"embed/etcd.go:579","msg":"stopping serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-07-31T17:00:28.900567Z","caller":"embed/etcd.go:584","msg":"stopped serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-07-31T17:00:28.900576Z","caller":"embed/etcd.go:377","msg":"closed etcd server","name":"ha-393000","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.169.0.5:2380"],"advertise-client-urls":["https://192.169.0.5:2379"]}
	
	
	==> etcd [86018b08bbaa] <==
	{"level":"info","ts":"2024-07-31T17:02:09.106978Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:02:09.364619Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:02:09.364696Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"warn","ts":"2024-07-31T17:02:09.36471Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"warn","ts":"2024-07-31T17:02:09.364717Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"info","ts":"2024-07-31T17:02:10.908301Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:10.908382Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:10.908402Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:10.908419Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:10.908429Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:12.706453Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:12.706497Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:12.706792Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:12.706867Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:12.706926Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:02:14.336186Z","caller":"etcdserver/server.go:2089","msg":"failed to publish local member to cluster through raft","local-member-id":"b8c6c7563d17d844","local-member-attributes":"{Name:ha-393000 ClientURLs:[https://192.169.0.5:2379]}","request-path":"/0/members/b8c6c7563d17d844/attributes","publish-timeout":"7s","error":"etcdserver: request timed out"}
	{"level":"warn","ts":"2024-07-31T17:02:14.364851Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:02:14.364914Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: i/o timeout"}
	{"level":"warn","ts":"2024-07-31T17:02:14.364925Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: i/o timeout"}
	{"level":"warn","ts":"2024-07-31T17:02:14.364923Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"info","ts":"2024-07-31T17:02:14.507119Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507168Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507183Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507199Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507209Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	
	
	==> kernel <==
	 17:02:14 up 1 min,  0 users,  load average: 0.17, 0.09, 0.03
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:59:40.110698       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:59:50.118349       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:59:50.118427       1 main.go:299] handling current node
	I0731 16:59:50.118450       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:59:50.118464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:59:50.118651       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:59:50.118739       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.118883       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:00.118987       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:00.119126       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:00.119236       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.119356       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:00.119483       1 main.go:299] handling current node
	I0731 17:00:10.110002       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:10.111054       1 main.go:299] handling current node
	I0731 17:00:10.111286       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:10.111319       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:10.111445       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:10.111480       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:20.116250       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:20.116442       1 main.go:299] handling current node
	I0731 17:00:20.116458       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:20.116464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:20.116608       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:20.116672       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [e557dfd18a90] <==
	I0731 17:01:43.256034       1 options.go:221] external host was not specified, using 192.169.0.5
	I0731 17:01:43.256765       1 server.go:148] Version: v1.30.3
	I0731 17:01:43.256805       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:01:43.612164       1 shared_informer.go:313] Waiting for caches to sync for node_authorizer
	I0731 17:01:43.614459       1 shared_informer.go:313] Waiting for caches to sync for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:01:43.616925       1 plugins.go:157] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0731 17:01:43.616979       1 plugins.go:160] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0731 17:01:43.617139       1 instance.go:299] Using reconciler: lease
	W0731 17:02:03.611949       1 logging.go:59] [core] [Channel #2 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0731 17:02:03.612150       1 logging.go:59] [core] [Channel #1 SubChannel #3] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	F0731 17:02:03.618964       1 instance.go:292] Error creating leases: error creating storage factory: context deadline exceeded
	
	
	==> kube-controller-manager [dd8064769032] <==
	I0731 17:01:51.487714       1 serving.go:380] Generated self-signed cert in-memory
	I0731 17:01:51.747070       1 controllermanager.go:189] "Starting" version="v1.30.3"
	I0731 17:01:51.747213       1 controllermanager.go:191] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:01:51.750433       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0731 17:01:51.750809       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:01:51.750880       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0731 17:01:51.750924       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	E0731 17:02:11.753015       1 controllermanager.go:234] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.169.0.5:8443/healthz\": dial tcp 192.169.0.5:8443: connect: connection refused"
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [65412448c586] <==
	E0731 16:53:48.491132       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0731 16:53:48.491335       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:48.491387       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0731 16:53:48.491507       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0731 16:53:48.491594       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0731 16:53:48.491662       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:48.491738       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:48.491818       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:48.491860       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:48.491537       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:48.491873       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.319781       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0731 16:53:49.319838       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0731 16:53:49.326442       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.326478       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.392116       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:49.392172       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:49.496014       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.496036       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.541411       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:49.541927       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:49.588695       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:49.588735       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	I0731 16:53:49.982415       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0731 17:00:28.842533       1 run.go:74] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [d088fefe5f8e] <==
	E0731 17:01:56.894936       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": net/http: TLS handshake timeout
	W0731 17:01:56.937531       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": net/http: TLS handshake timeout
	I0731 17:01:56.937697       1 trace.go:236] Trace[712161407]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jul-2024 17:01:46.936) (total time: 10001ms):
	Trace[712161407]: ---"Objects listed" error:Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (17:01:56.937)
	Trace[712161407]: [10.001085583s] [10.001085583s] END
	E0731 17:01:56.937717       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": net/http: TLS handshake timeout
	W0731 17:01:57.191734       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": net/http: TLS handshake timeout
	I0731 17:01:57.191804       1 trace.go:236] Trace[36189415]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jul-2024 17:01:47.191) (total time: 10000ms):
	Trace[36189415]: ---"Objects listed" error:Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (17:01:57.191)
	Trace[36189415]: [10.000695567s] [10.000695567s] END
	E0731 17:01:57.191818       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": net/http: TLS handshake timeout
	W0731 17:02:04.142372       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:04.142613       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:04.628727       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55278->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.628988       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55278->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:04.628727       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55294->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.629137       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55294->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:04.629274       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55260->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.629350       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55260->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:04.629024       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: Get "https://192.169.0.5:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55272->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.629603       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://192.169.0.5:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55272->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:05.172498       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:05.172740       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:14.697016       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:14.697068       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	
	
	==> kubelet <==
	Jul 31 17:01:57 ha-393000 kubelet[1593]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:01:57 ha-393000 kubelet[1593]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:01:57 ha-393000 kubelet[1593]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:01:57 ha-393000 kubelet[1593]: E0731 17:01:57.152916    1593 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ha-393000\" not found"
	Jul 31 17:01:58 ha-393000 kubelet[1593]: E0731 17:01:58.501133    1593 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://control-plane.minikube.internal:8443/api/v1/namespaces/default/events\": dial tcp 192.169.0.254:8443: connect: no route to host" event="&Event{ObjectMeta:{ha-393000.17e75ad5dc50e2da  default    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ha-393000,UID:ha-393000,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ha-393000,},FirstTimestamp:2024-07-31 17:00:57.063326426 +0000 UTC m=+0.117315655,LastTimestamp:2024-07-31 17:00:57.063326426 +0000 UTC m=+0.117315655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ha-393000,}"
	Jul 31 17:02:02 ha-393000 kubelet[1593]: I0731 17:02:02.431480    1593 kubelet_node_status.go:73] "Attempting to register node" node="ha-393000"
	Jul 31 17:02:03 ha-393000 kubelet[1593]: I0731 17:02:03.874945    1593 scope.go:117] "RemoveContainer" containerID="8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7"
	Jul 31 17:02:03 ha-393000 kubelet[1593]: I0731 17:02:03.875815    1593 scope.go:117] "RemoveContainer" containerID="e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb"
	Jul 31 17:02:03 ha-393000 kubelet[1593]: E0731 17:02:03.876093    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver pod=kube-apiserver-ha-393000_kube-system(97d9112f0375227c852af24f4082cc7e)\"" pod="kube-system/kube-apiserver-ha-393000" podUID="97d9112f0375227c852af24f4082cc7e"
	Jul 31 17:02:04 ha-393000 kubelet[1593]: E0731 17:02:04.641093    1593 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://control-plane.minikube.internal:8443/api/v1/nodes\": dial tcp 192.169.0.254:8443: connect: no route to host" node="ha-393000"
	Jul 31 17:02:04 ha-393000 kubelet[1593]: E0731 17:02:04.641160    1593 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://control-plane.minikube.internal:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ha-393000?timeout=10s\": dial tcp 192.169.0.254:8443: connect: no route to host" interval="7s"
	Jul 31 17:02:07 ha-393000 kubelet[1593]: E0731 17:02:07.161123    1593 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ha-393000\" not found"
	Jul 31 17:02:07 ha-393000 kubelet[1593]: I0731 17:02:07.713966    1593 scope.go:117] "RemoveContainer" containerID="e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb"
	Jul 31 17:02:07 ha-393000 kubelet[1593]: E0731 17:02:07.714569    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver pod=kube-apiserver-ha-393000_kube-system(97d9112f0375227c852af24f4082cc7e)\"" pod="kube-system/kube-apiserver-ha-393000" podUID="97d9112f0375227c852af24f4082cc7e"
	Jul 31 17:02:10 ha-393000 kubelet[1593]: I0731 17:02:10.692480    1593 scope.go:117] "RemoveContainer" containerID="e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb"
	Jul 31 17:02:10 ha-393000 kubelet[1593]: E0731 17:02:10.693256    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver pod=kube-apiserver-ha-393000_kube-system(97d9112f0375227c852af24f4082cc7e)\"" pod="kube-system/kube-apiserver-ha-393000" podUID="97d9112f0375227c852af24f4082cc7e"
	Jul 31 17:02:10 ha-393000 kubelet[1593]: E0731 17:02:10.785079    1593 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://control-plane.minikube.internal:8443/api/v1/namespaces/default/events\": dial tcp 192.169.0.254:8443: connect: no route to host" event="&Event{ObjectMeta:{ha-393000.17e75ad5dc50e2da  default    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ha-393000,UID:ha-393000,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ha-393000,},FirstTimestamp:2024-07-31 17:00:57.063326426 +0000 UTC m=+0.117315655,LastTimestamp:2024-07-31 17:00:57.063326426 +0000 UTC m=+0.117315655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ha-393000,}"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: I0731 17:02:11.644681    1593 kubelet_node_status.go:73] "Attempting to register node" node="ha-393000"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: I0731 17:02:11.957788    1593 scope.go:117] "RemoveContainer" containerID="375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: I0731 17:02:11.958478    1593 scope.go:117] "RemoveContainer" containerID="dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: E0731 17:02:11.958720    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-ha-393000_kube-system(ae5c50a5b151d76ab8b2e88315db2b23)\"" pod="kube-system/kube-controller-manager-ha-393000" podUID="ae5c50a5b151d76ab8b2e88315db2b23"
	Jul 31 17:02:13 ha-393000 kubelet[1593]: W0731 17:02:13.855738    1593 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://control-plane.minikube.internal:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.254:8443: connect: no route to host
	Jul 31 17:02:13 ha-393000 kubelet[1593]: E0731 17:02:13.855784    1593 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://control-plane.minikube.internal:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.254:8443: connect: no route to host
	Jul 31 17:02:13 ha-393000 kubelet[1593]: E0731 17:02:13.855832    1593 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://control-plane.minikube.internal:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ha-393000?timeout=10s\": dial tcp 192.169.0.254:8443: connect: no route to host" interval="7s"
	Jul 31 17:02:13 ha-393000 kubelet[1593]: E0731 17:02:13.855916    1593 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://control-plane.minikube.internal:8443/api/v1/nodes\": dial tcp 192.169.0.254:8443: connect: no route to host" node="ha-393000"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:254: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000: exit status 2 (147.02716ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:254: status error: exit status 2 (may be ok)
helpers_test.go:256: "ha-393000" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestMultiControlPlane/serial/RestartClusterKeepsNodes (126.35s)

                                                
                                    
TestMultiControlPlane/serial/DeleteSecondaryNode (2.92s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/DeleteSecondaryNode
ha_test.go:487: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 node delete m03 -v=7 --alsologtostderr
ha_test.go:487: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 node delete m03 -v=7 --alsologtostderr: exit status 83 (173.187505ms)

                                                
                                                
-- stdout --
	* The control-plane node ha-393000-m03 host is not running: state=Stopped
	  To start a cluster, run: "minikube start -p ha-393000"

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 10:02:15.701843    3734 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:02:15.702219    3734 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:02:15.702225    3734 out.go:304] Setting ErrFile to fd 2...
	I0731 10:02:15.702229    3734 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:02:15.702414    3734 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:02:15.702748    3734 mustload.go:65] Loading cluster: ha-393000
	I0731 10:02:15.703090    3734 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:02:15.703464    3734 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.703497    3734 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.711617    3734 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51876
	I0731 10:02:15.712015    3734 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.712433    3734 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.712444    3734 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.712659    3734 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.712774    3734 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:02:15.712865    3734 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:02:15.712922    3734 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:02:15.713907    3734 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:02:15.714138    3734 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.714159    3734 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.722634    3734 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51878
	I0731 10:02:15.722997    3734 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.723387    3734 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.723411    3734 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.723636    3734 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.723754    3734 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:02:15.724099    3734 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.724123    3734 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.732416    3734 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51880
	I0731 10:02:15.732744    3734 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.733092    3734 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.733106    3734 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.733296    3734 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.733399    3734 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:02:15.733473    3734 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:02:15.733556    3734 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:02:15.734559    3734 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 10:02:15.734838    3734 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.734877    3734 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.743171    3734 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51882
	I0731 10:02:15.743505    3734 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.743824    3734 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.743844    3734 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.744065    3734 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.744181    3734 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:02:15.744531    3734 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.744558    3734 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.752702    3734 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51884
	I0731 10:02:15.753032    3734 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.753379    3734 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.753397    3734 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.753616    3734 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.753738    3734 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 10:02:15.753818    3734 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:02:15.753909    3734 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 10:02:15.754858    3734 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:02:15.778066    3734 out.go:177] * The control-plane node ha-393000-m03 host is not running: state=Stopped
	I0731 10:02:15.799361    3734 out.go:177]   To start a cluster, run: "minikube start -p ha-393000"

                                                
                                                
** /stderr **
ha_test.go:489: node delete returned an error. args "out/minikube-darwin-amd64 -p ha-393000 node delete m03 -v=7 --alsologtostderr": exit status 83
ha_test.go:493: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:493: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 7 (253.779154ms)

                                                
                                                
-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-393000-m04
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0731 10:02:15.874810    3741 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:02:15.874998    3741 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:02:15.875003    3741 out.go:304] Setting ErrFile to fd 2...
	I0731 10:02:15.875007    3741 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:02:15.875187    3741 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:02:15.875372    3741 out.go:298] Setting JSON to false
	I0731 10:02:15.875393    3741 mustload.go:65] Loading cluster: ha-393000
	I0731 10:02:15.875434    3741 notify.go:220] Checking for updates...
	I0731 10:02:15.875694    3741 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:02:15.875710    3741 status.go:255] checking status of ha-393000 ...
	I0731 10:02:15.876048    3741 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.876090    3741 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.885090    3741 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51887
	I0731 10:02:15.885435    3741 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.885831    3741 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.885846    3741 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.886048    3741 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.886153    3741 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:02:15.886241    3741 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:02:15.886307    3741 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:02:15.887342    3741 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 10:02:15.887368    3741 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:02:15.887640    3741 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.887691    3741 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.896089    3741 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51889
	I0731 10:02:15.896414    3741 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.896773    3741 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.896785    3741 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.897012    3741 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.897124    3741 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:02:15.897210    3741 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:02:15.897457    3741 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.897479    3741 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.907074    3741 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51891
	I0731 10:02:15.907390    3741 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.907708    3741 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.907720    3741 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.907915    3741 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.908025    3741 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:02:15.908170    3741 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:02:15.908190    3741 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:02:15.908259    3741 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:02:15.908332    3741 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:02:15.908410    3741 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:02:15.908512    3741 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:02:15.942780    3741 ssh_runner.go:195] Run: systemctl --version
	I0731 10:02:15.947517    3741 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:02:15.958190    3741 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 10:02:15.958213    3741 api_server.go:166] Checking apiserver status ...
	I0731 10:02:15.958250    3741 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0731 10:02:15.969107    3741 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:02:15.969117    3741 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 10:02:15.969125    3741 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:02:15.969136    3741 status.go:255] checking status of ha-393000-m02 ...
	I0731 10:02:15.969418    3741 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.969438    3741 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.978162    3741 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51894
	I0731 10:02:15.978502    3741 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.978855    3741 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.978873    3741 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.979072    3741 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.979185    3741 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:02:15.979270    3741 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:02:15.979337    3741 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:02:15.980318    3741 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 10:02:15.980350    3741 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 10:02:15.980617    3741 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.980637    3741 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.989138    3741 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51896
	I0731 10:02:15.989502    3741 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.989852    3741 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.989866    3741 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.990094    3741 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.990191    3741 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:02:15.990279    3741 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 10:02:15.990545    3741 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:15.990571    3741 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:15.999002    3741 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51898
	I0731 10:02:15.999315    3741 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:15.999626    3741 main.go:141] libmachine: Using API Version  1
	I0731 10:02:15.999637    3741 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:15.999842    3741 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:15.999987    3741 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:02:16.000115    3741 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:02:16.000126    3741 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:02:16.000199    3741 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:02:16.000278    3741 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:02:16.000358    3741 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:02:16.000433    3741 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:02:16.031544    3741 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:02:16.042491    3741 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 10:02:16.042505    3741 api_server.go:166] Checking apiserver status ...
	I0731 10:02:16.042541    3741 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0731 10:02:16.052071    3741 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:02:16.052080    3741 status.go:422] ha-393000-m02 apiserver status = Stopped (err=<nil>)
	I0731 10:02:16.052089    3741 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:02:16.052099    3741 status.go:255] checking status of ha-393000-m03 ...
	I0731 10:02:16.052350    3741 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:16.052376    3741 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:16.060941    3741 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51901
	I0731 10:02:16.061321    3741 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:16.061638    3741 main.go:141] libmachine: Using API Version  1
	I0731 10:02:16.061654    3741 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:16.061888    3741 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:16.062004    3741 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 10:02:16.062089    3741 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:02:16.062161    3741 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 10:02:16.063140    3741 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:02:16.063164    3741 status.go:330] ha-393000-m03 host status = "Stopped" (err=<nil>)
	I0731 10:02:16.063172    3741 status.go:343] host is not running, skipping remaining checks
	I0731 10:02:16.063186    3741 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:02:16.063200    3741 status.go:255] checking status of ha-393000-m04 ...
	I0731 10:02:16.063470    3741 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:02:16.063507    3741 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:02:16.072132    3741 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51903
	I0731 10:02:16.072483    3741 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:02:16.072794    3741 main.go:141] libmachine: Using API Version  1
	I0731 10:02:16.072805    3741 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:02:16.073000    3741 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:02:16.073109    3741 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 10:02:16.073198    3741 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:02:16.073276    3741 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 10:02:16.074223    3741 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid 3095 missing from process table
	I0731 10:02:16.074264    3741 status.go:330] ha-393000-m04 host status = "Stopped" (err=<nil>)
	I0731 10:02:16.074277    3741 status.go:343] host is not running, skipping remaining checks
	I0731 10:02:16.074284    3741 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:495: failed to run minikube status. args "out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr" : exit status 7
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000: exit status 2 (147.20363ms)

-- stdout --
	Running

-- /stdout --
helpers_test.go:239: status error: exit status 2 (may be ok)
helpers_test.go:244: <<< TestMultiControlPlane/serial/DeleteSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/DeleteSecondaryNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (2.147076455s)
helpers_test.go:252: TestMultiControlPlane/serial/DeleteSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node stop m02 -v=7         | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:58 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node start m02 -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:59 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000 -v=7               | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-393000 -v=7                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT | 31 Jul 24 10:00 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	| node    | ha-393000 node delete m03 -v=7       | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 10:00:36
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 10:00:36.495656    3673 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:00:36.495846    3673 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:00:36.495851    3673 out.go:304] Setting ErrFile to fd 2...
	I0731 10:00:36.495855    3673 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:00:36.496034    3673 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:00:36.497502    3673 out.go:298] Setting JSON to false
	I0731 10:00:36.520791    3673 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1806,"bootTime":1722443430,"procs":453,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:00:36.520876    3673 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:00:36.543180    3673 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 10:00:36.586807    3673 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:00:36.586859    3673 notify.go:220] Checking for updates...
	I0731 10:00:36.634547    3673 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:36.676763    3673 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:00:36.720444    3673 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:00:36.764479    3673 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:00:36.807390    3673 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:00:36.829325    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:36.829489    3673 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:00:36.830157    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.830242    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:36.839843    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51805
	I0731 10:00:36.840174    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:36.840633    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:36.840652    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:36.840857    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:36.840969    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:36.869457    3673 out.go:177] * Using the hyperkit driver based on existing profile
	I0731 10:00:36.911576    3673 start.go:297] selected driver: hyperkit
	I0731 10:00:36.911605    3673 start.go:901] validating driver "hyperkit" against &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:fals
e efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p20
00.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:36.911854    3673 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:00:36.912050    3673 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:00:36.912259    3673 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:00:36.921863    3673 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:00:36.925765    3673 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.925786    3673 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:00:36.929097    3673 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:00:36.929172    3673 cni.go:84] Creating CNI manager for ""
	I0731 10:00:36.929182    3673 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:00:36.929256    3673 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.16
9.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-t
iller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0
MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:36.929358    3673 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:00:36.971579    3673 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 10:00:36.992648    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:36.992745    3673 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:00:36.992770    3673 cache.go:56] Caching tarball of preloaded images
	I0731 10:00:36.992959    3673 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:00:36.992977    3673 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:00:36.993162    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:36.993985    3673 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:00:36.994105    3673 start.go:364] duration metric: took 96.51µs to acquireMachinesLock for "ha-393000"
	I0731 10:00:36.994138    3673 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:00:36.994154    3673 fix.go:54] fixHost starting: 
	I0731 10:00:36.994597    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.994672    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:37.003590    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51807
	I0731 10:00:37.003945    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:37.004312    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:37.004326    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:37.004582    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:37.004711    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:37.004817    3673 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:00:37.004901    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.004979    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 10:00:37.005943    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 2965 missing from process table
	I0731 10:00:37.005965    3673 fix.go:112] recreateIfNeeded on ha-393000: state=Stopped err=<nil>
	I0731 10:00:37.005979    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	W0731 10:00:37.006061    3673 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:00:37.048650    3673 out.go:177] * Restarting existing hyperkit VM for "ha-393000" ...
	I0731 10:00:37.069570    3673 main.go:141] libmachine: (ha-393000) Calling .Start
	I0731 10:00:37.069827    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.069877    3673 main.go:141] libmachine: (ha-393000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 10:00:37.071916    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 2965 missing from process table
	I0731 10:00:37.071929    3673 main.go:141] libmachine: (ha-393000) DBG | pid 2965 is in state "Stopped"
	I0731 10:00:37.071946    3673 main.go:141] libmachine: (ha-393000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid...
	I0731 10:00:37.072316    3673 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 10:00:37.238669    3673 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 10:00:37.238692    3673 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:00:37.238840    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c1020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:37.238867    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c1020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:37.238912    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=s
erial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:00:37.238957    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset
norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:00:37.238973    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:00:37.240553    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Pid is 3685
	I0731 10:00:37.240991    3673 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 10:00:37.241011    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.241081    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:00:37.243087    3673 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 10:00:37.243212    3673 main.go:141] libmachine: (ha-393000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:00:37.243230    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:00:37.243264    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbebe}
	I0731 10:00:37.243290    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 10:00:37.243299    3673 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 10:00:37.243315    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 10:00:37.243326    3673 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 10:00:37.243339    3673 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 10:00:37.243975    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:37.244206    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:37.244644    3673 machine.go:94] provisionDockerMachine start ...
	I0731 10:00:37.244655    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:37.244765    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:37.244869    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:37.244966    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:37.245080    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:37.245168    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:37.245280    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:37.245490    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:37.245498    3673 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:00:37.248660    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:00:37.300309    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:00:37.301000    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:37.301012    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:37.301019    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:37.301029    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:37.684614    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:00:37.684630    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:00:37.799569    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:37.799605    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:37.799644    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:37.799683    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:37.800441    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:00:37.800452    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:00:43.367703    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:00:43.367775    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:00:43.367785    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:00:43.391726    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:00:47.223864    3673 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.5:22: connect: connection refused
	I0731 10:00:50.289047    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:00:50.289062    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.289203    3673 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 10:00:50.289213    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.289308    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.289398    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.289487    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.289585    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.289691    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.289830    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.289999    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.290007    3673 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 10:00:50.363752    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 10:00:50.363772    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.363906    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.364014    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.364093    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.364179    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.364291    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.364432    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.364443    3673 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:00:50.433878    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:00:50.433898    3673 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:00:50.433922    3673 buildroot.go:174] setting up certificates
	I0731 10:00:50.433931    3673 provision.go:84] configureAuth start
	I0731 10:00:50.433938    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.434079    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:50.434192    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.434286    3673 provision.go:143] copyHostCerts
	I0731 10:00:50.434327    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:00:50.434402    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:00:50.434411    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:00:50.434544    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:00:50.434743    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:00:50.434783    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:00:50.434794    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:00:50.434879    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:00:50.435018    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:00:50.435058    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:00:50.435063    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:00:50.435178    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:00:50.435321    3673 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 10:00:50.506730    3673 provision.go:177] copyRemoteCerts
	I0731 10:00:50.506778    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:00:50.506790    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.506910    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.507000    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.507081    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.507175    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:50.545550    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:00:50.545628    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 10:00:50.565303    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:00:50.565359    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:00:50.584957    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:00:50.585022    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0731 10:00:50.604884    3673 provision.go:87] duration metric: took 170.940154ms to configureAuth
	I0731 10:00:50.604897    3673 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:00:50.605065    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:50.605078    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:50.605206    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.605298    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.605377    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.605465    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.605532    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.605631    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.605760    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.605768    3673 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:00:50.667182    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:00:50.667193    3673 buildroot.go:70] root file system type: tmpfs
	I0731 10:00:50.667267    3673 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:00:50.667284    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.667424    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.667511    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.667602    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.667687    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.667814    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.668002    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.668046    3673 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:00:50.740959    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
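The empty `ExecStart=` directly above the real `ExecStart=` line exploits systemd's reset rule, which the comments in this unit describe: assigning an empty value clears the command list accumulated from earlier units, so only commands after the reset remain. A toy parser illustrating just that rule (a sketch with invented unit content, not systemd's actual parser):

```shell
#!/bin/sh
# Toy model of systemd's list-reset rule: an empty "ExecStart=" clears
# every command collected so far, so only assignments after the reset
# survive. The unit content below is invented for the demonstration.
unit=$(mktemp)
cat > "$unit" <<'EOF'
ExecStart=/usr/bin/dockerd-from-base-unit
ExecStart=
ExecStart=/usr/bin/dockerd -H unix:///var/run/docker.sock
EOF

surviving=$(awk -F= '/^ExecStart=/ {
    if ($2 == "") { n = 0 }                          # empty value: reset
    else { cmds[n++] = substr($0, index($0, "=") + 1) }
}
END { for (i = 0; i < n; i++) print cmds[i] }' "$unit")

echo "$surviving"
```

Without the reset line, systemd would see two `ExecStart=` commands for a non-oneshot service and refuse to start it, exactly as the comment above warns.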
	
	I0731 10:00:50.740978    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.741112    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.741203    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.741293    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.741371    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.741505    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.741655    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.741668    3673 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:00:52.502025    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:00:52.502040    3673 machine.go:97] duration metric: took 15.257390709s to provisionDockerMachine
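The `diff -u … || { mv …; systemctl … restart docker; }` command a few lines up is an idempotence guard: the freshly rendered unit is swapped in, and docker restarted, only when it differs from what is already installed (or, as in this run, when no unit exists yet, since `diff` also exits non-zero then). The same check-and-swap pattern in isolation, with hypothetical temp-file paths and the systemd calls replaced by a marker variable:

```shell
#!/bin/sh
# Check-and-swap: replace the installed file only when the staged copy
# differs. `diff` exits 0 on identical files and non-zero on a
# difference or a missing file, so the || block is the "update needed"
# path. File names here are invented for the sketch.
target=$(mktemp) && rm -f "$target"     # simulate a not-yet-installed file
staged=$(mktemp)
printf 'ExecStart=/usr/bin/dockerd\n' > "$staged"

action=none
diff -u "$target" "$staged" >/dev/null 2>&1 || {
    mv "$staged" "$target"
    action=updated                      # stand-in for daemon-reload/restart
}
echo "$action"
```

On a second run against an unchanged staged copy, `diff` would exit 0 and the service would be left alone.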
	I0731 10:00:52.502051    3673 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 10:00:52.502059    3673 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:00:52.502069    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.502248    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:00:52.502270    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.502370    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.502470    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.502555    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.502643    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.547198    3673 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:00:52.551739    3673 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:00:52.551756    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:00:52.551866    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:00:52.552049    3673 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:00:52.552056    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:00:52.552268    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:00:52.560032    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:00:52.588419    3673 start.go:296] duration metric: took 86.358285ms for postStartSetup
	I0731 10:00:52.588445    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.588627    3673 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:00:52.588639    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.588726    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.588826    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.588924    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.589019    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.626220    3673 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:00:52.626280    3673 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:00:52.679056    3673 fix.go:56] duration metric: took 15.684908032s for fixHost
	I0731 10:00:52.679077    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.679217    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.679309    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.679414    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.679512    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.679640    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:52.679796    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:52.679804    3673 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 10:00:52.743374    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445252.855355235
	
	I0731 10:00:52.743388    3673 fix.go:216] guest clock: 1722445252.855355235
	I0731 10:00:52.743395    3673 fix.go:229] Guest: 2024-07-31 10:00:52.855355235 -0700 PDT Remote: 2024-07-31 10:00:52.679067 -0700 PDT m=+16.220078068 (delta=176.288235ms)
	I0731 10:00:52.743428    3673 fix.go:200] guest clock delta is within tolerance: 176.288235ms
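The guest-clock check above runs `date` on the VM (the `1722445252.855355235` reply), compares it with the host clock, and accepts the drift when it stays under a tolerance. A standalone sketch of that comparison, using the two timestamps from this log and a hypothetical 2-second threshold:

```shell
#!/bin/sh
# Clock-drift check: subtract two epoch timestamps in
# seconds.nanoseconds form (as printed by `date +%s.%N`) and compare
# the absolute difference against a tolerance. The timestamps are the
# two values from the log; the 2s threshold is an assumption.
guest=1722445252.855355235
host=1722445252.679067
tolerance=2

delta=$(awk -v g="$guest" -v h="$host" 'BEGIN {
    d = g - h; if (d < 0) d = -d; printf "%.3f", d
}')
within=$(awk -v d="$delta" -v t="$tolerance" 'BEGIN {
    print ((d <= t) ? "yes" : "no")
}')
echo "delta=${delta}s within-tolerance=${within}"
```

A drift above the threshold would instead trigger a clock resync on the guest before provisioning continues.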
	I0731 10:00:52.743433    3673 start.go:83] releasing machines lock for "ha-393000", held for 15.749318892s
	I0731 10:00:52.743452    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.743591    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:52.743689    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.743983    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.744104    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.744194    3673 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:00:52.744225    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.744256    3673 ssh_runner.go:195] Run: cat /version.json
	I0731 10:00:52.744267    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.744309    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.744357    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.744392    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.744456    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.744481    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.744548    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.744567    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.744621    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.782140    3673 ssh_runner.go:195] Run: systemctl --version
	I0731 10:00:52.830138    3673 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 10:00:52.835167    3673 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:00:52.835215    3673 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:00:52.850271    3673 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:00:52.850283    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:00:52.850381    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:00:52.866468    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:00:52.875484    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:00:52.884391    3673 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:00:52.884442    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:00:52.893278    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:00:52.902262    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:00:52.911487    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:00:52.920398    3673 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:00:52.929249    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:00:52.938174    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:00:52.947153    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:00:52.956120    3673 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:00:52.964110    3673 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
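The `sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"` form above is deliberate: a redirection is performed by the shell that parses it, so plain `sudo echo 1 > file` would open the file as the unprivileged caller and fail. Wrapping the command and the redirection in one `sh -c` string makes the child shell, which holds root in the real case, perform the write. A sketch with an ordinary temp file standing in for the procfs entry (no root needed):

```shell
#!/bin/sh
# Redirections belong to the shell that parses the command line. By
# passing the whole string to `sh -c`, the redirection is evaluated in
# the child shell -- in the real command that child runs under sudo,
# so the write to /proc succeeds. A temp file stands in for procfs.
f=$(mktemp)
sh -c "echo 1 > $f"
cat "$f"
```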
	I0731 10:00:52.972137    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:53.078103    3673 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:00:53.097675    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:00:53.097756    3673 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:00:53.112446    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:00:53.127406    3673 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:00:53.144131    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:00:53.155196    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:00:53.165334    3673 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:00:53.191499    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:00:53.201932    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:00:53.215879    3673 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:00:53.218826    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:00:53.226091    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:00:53.239772    3673 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:00:53.345187    3673 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:00:53.464204    3673 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:00:53.464287    3673 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:00:53.478304    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:53.574350    3673 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:00:55.900933    3673 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.32656477s)
	I0731 10:00:55.900999    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:00:55.912322    3673 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:00:55.925824    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:00:55.936801    3673 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:00:56.031696    3673 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:00:56.126413    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.241279    3673 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:00:56.258174    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:00:56.269382    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.365935    3673 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:00:56.430017    3673 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:00:56.430103    3673 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:00:56.435941    3673 start.go:563] Will wait 60s for crictl version
	I0731 10:00:56.435987    3673 ssh_runner.go:195] Run: which crictl
	I0731 10:00:56.439039    3673 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:00:56.464351    3673 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:00:56.464431    3673 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:00:56.482582    3673 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:00:56.520920    3673 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:00:56.520980    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:56.521425    3673 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:00:56.525996    3673 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
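The hosts-file update above is another idempotent edit: `grep -v` drops any existing `host.minikube.internal` line, the fresh mapping is appended, and the result is copied back over `/etc/hosts`, so repeated runs converge on exactly one entry. The same pattern against a temp file instead of `/etc/hosts` (contents invented for the sketch):

```shell
#!/bin/sh
# Idempotent hosts-entry rewrite: remove any stale line for the name,
# append the current mapping, then copy the result back into place.
# A temp file stands in for /etc/hosts so no privileges are needed.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.169.0.9\thost.minikube.internal\n' > "$hosts"

tab=$(printf '\t')
tmp=$(mktemp)
{ grep -v "${tab}host.minikube.internal\$" "$hosts"
  printf '192.169.0.1\thost.minikube.internal\n'; } > "$tmp"
cp "$tmp" "$hosts"
cat "$hosts"
```

The `cp` (rather than `mv`) at the end mirrors the logged command and preserves the ownership and permissions of the original file.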
	I0731 10:00:56.535672    3673 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 10:00:56.535754    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:56.535821    3673 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:00:56.552650    3673 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:00:56.552662    3673 docker.go:615] Images already preloaded, skipping extraction
	I0731 10:00:56.552737    3673 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:00:56.566645    3673 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:00:56.566662    3673 cache_images.go:84] Images are preloaded, skipping loading
	I0731 10:00:56.566671    3673 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 10:00:56.566751    3673 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:00:56.566818    3673 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 10:00:56.602960    3673 cni.go:84] Creating CNI manager for ""
	I0731 10:00:56.602973    3673 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:00:56.602987    3673 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 10:00:56.603002    3673 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 10:00:56.603093    3673 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 10:00:56.603113    3673 kube-vip.go:115] generating kube-vip config ...
	I0731 10:00:56.603160    3673 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:00:56.615328    3673 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:00:56.615398    3673 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:00:56.615448    3673 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:00:56.624876    3673 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:00:56.624925    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 10:00:56.632950    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 10:00:56.646343    3673 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:00:56.659992    3673 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 10:00:56.674079    3673 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:00:56.687863    3673 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:00:56.690766    3673 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:00:56.700932    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.802080    3673 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:00:56.817267    3673 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 10:00:56.817279    3673 certs.go:194] generating shared ca certs ...
	I0731 10:00:56.817290    3673 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.817479    3673 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:00:56.817554    3673 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:00:56.817564    3673 certs.go:256] generating profile certs ...
	I0731 10:00:56.817680    3673 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:00:56.817703    3673 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e
	I0731 10:00:56.817718    3673 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0731 10:00:56.884314    3673 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e ...
	I0731 10:00:56.884330    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e: {Name:mk4c6f4a11277f3afefbfb19687b3bc0d7252c4a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.884782    3673 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e ...
	I0731 10:00:56.884793    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e: {Name:mk5943238cbce29d53e24742be6a5a17eba24882 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.885016    3673 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 10:00:56.885227    3673 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 10:00:56.885483    3673 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:00:56.885493    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:00:56.885517    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:00:56.885538    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:00:56.885559    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:00:56.885578    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:00:56.885598    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:00:56.885616    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:00:56.885638    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:00:56.885737    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:00:56.885783    3673 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:00:56.885791    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:00:56.885821    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:00:56.885850    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:00:56.885884    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:00:56.885950    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:00:56.885985    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:00:56.886006    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:56.886025    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:00:56.886457    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:00:56.908157    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:00:56.929396    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:00:56.960758    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:00:56.990308    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:00:57.032527    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:00:57.067819    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:00:57.106993    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:00:57.128481    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:00:57.148005    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:00:57.167023    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:00:57.187176    3673 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 10:00:57.200671    3673 ssh_runner.go:195] Run: openssl version
	I0731 10:00:57.204930    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:00:57.213254    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.216716    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.216751    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.221199    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:00:57.229492    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:00:57.237806    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.241238    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.241272    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.245487    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:00:57.253849    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:00:57.262176    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.265564    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.265596    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.269893    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
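The `openssl x509 -hash` / `ln -fs ... <hash>.0` pairs above install each CA into the VM's OpenSSL trust directory. A minimal sketch of that convention, using a throwaway self-signed cert in a temp dir rather than the log's real paths (all paths here are illustrative): OpenSSL looks up CAs in `/etc/ssl/certs` by subject-name hash, so each PEM needs a symlink named `<hash>.0`.

```shell
set -e
tmp=$(mktemp -d)

# Stand-in for /usr/share/ca-certificates/minikubeCA.pem (assumed name).
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=minikubeCA" \
  -keyout "$tmp/ca.key" -out "$tmp/minikubeCA.pem" -days 1 2>/dev/null

# Same hash the log computes with `openssl x509 -hash -noout -in ...`.
hash=$(openssl x509 -hash -noout -in "$tmp/minikubeCA.pem")

# In the VM this is `ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/$hash.0`;
# here the link lands in the temp dir instead.
ln -fs "$tmp/minikubeCA.pem" "$tmp/$hash.0"

# Verify the hash-named link resolves to the cert.
openssl x509 -noout -subject -in "$tmp/$hash.0"
```

The `test -L ... || ln -fs ...` form in the log makes the step idempotent across restarts: an existing symlink is left alone.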
	I0731 10:00:57.278252    3673 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:00:57.281742    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:00:57.286355    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:00:57.290726    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:00:57.295185    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:00:57.299486    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:00:57.303679    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
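The `-checkend 86400` runs above are expiry probes: the command exits 0 only if the certificate will still be valid 86400 seconds (24 hours) from now, which is how minikube decides whether existing control-plane certs can be reused on restart. A self-contained sketch with a throwaway cert (paths and names are illustrative, not the log's):

```shell
set -e
tmp=$(mktemp -d)

# Stand-in for /var/lib/minikube/certs/apiserver-kubelet-client.crt,
# valid for 2 days so the 24h check passes.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=apiserver-kubelet-client" \
  -keyout "$tmp/k.pem" -out "$tmp/client.crt" -days 2 2>/dev/null

# Exit 0: cert is still valid 24h from now, so no regeneration is needed.
if openssl x509 -noout -in "$tmp/client.crt" -checkend 86400; then
  echo "cert valid for at least 24h"
fi
```

A cert within 24h of expiry (or already expired) makes `-checkend 86400` exit non-zero, and minikube would regenerate it instead.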
	I0731 10:00:57.308178    3673 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 C
lusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false fres
hpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:do
cker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:57.308287    3673 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 10:00:57.320531    3673 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 10:00:57.328175    3673 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0731 10:00:57.328185    3673 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0731 10:00:57.328220    3673 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0731 10:00:57.336122    3673 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:00:57.336458    3673 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-393000" does not appear in /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.336540    3673 kubeconfig.go:62] /Users/jenkins/minikube-integration/19349-1046/kubeconfig needs updating (will repair): [kubeconfig missing "ha-393000" cluster setting kubeconfig missing "ha-393000" context setting]
	I0731 10:00:57.336737    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.337225    3673 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.337442    3673 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)},
UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x4704660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 10:00:57.337765    3673 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 10:00:57.337950    3673 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0731 10:00:57.345178    3673 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0731 10:00:57.345192    3673 kubeadm.go:597] duration metric: took 17.003153ms to restartPrimaryControlPlane
	I0731 10:00:57.345197    3673 kubeadm.go:394] duration metric: took 37.024806ms to StartCluster
	I0731 10:00:57.345206    3673 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.345280    3673 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.345652    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.345873    3673 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:00:57.345886    3673 start.go:241] waiting for startup goroutines ...
	I0731 10:00:57.345893    3673 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 10:00:57.346021    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:57.387797    3673 out.go:177] * Enabled addons: 
	I0731 10:00:57.409811    3673 addons.go:510] duration metric: took 63.912845ms for enable addons: enabled=[]
	I0731 10:00:57.409921    3673 start.go:246] waiting for cluster config update ...
	I0731 10:00:57.409935    3673 start.go:255] writing updated cluster config ...
	I0731 10:00:57.431766    3673 out.go:177] 
	I0731 10:00:57.453235    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:57.453375    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.475712    3673 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 10:00:57.517781    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:57.517815    3673 cache.go:56] Caching tarball of preloaded images
	I0731 10:00:57.517989    3673 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:00:57.518009    3673 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:00:57.518145    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.519033    3673 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:00:57.519139    3673 start.go:364] duration metric: took 81.076µs to acquireMachinesLock for "ha-393000-m02"
	I0731 10:00:57.519165    3673 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:00:57.519174    3673 fix.go:54] fixHost starting: m02
	I0731 10:00:57.519609    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:57.519636    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:57.528542    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51833
	I0731 10:00:57.528874    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:57.529218    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:57.529229    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:57.529483    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:57.529615    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:00:57.529706    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:00:57.529814    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.529894    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 10:00:57.530866    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3205 missing from process table
	I0731 10:00:57.530885    3673 fix.go:112] recreateIfNeeded on ha-393000-m02: state=Stopped err=<nil>
	I0731 10:00:57.530896    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	W0731 10:00:57.530989    3673 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:00:57.572756    3673 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m02" ...
	I0731 10:00:57.593944    3673 main.go:141] libmachine: (ha-393000-m02) Calling .Start
	I0731 10:00:57.594326    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.594377    3673 main.go:141] libmachine: (ha-393000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 10:00:57.596218    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3205 missing from process table
	I0731 10:00:57.596230    3673 main.go:141] libmachine: (ha-393000-m02) DBG | pid 3205 is in state "Stopped"
	I0731 10:00:57.596246    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid...
	I0731 10:00:57.596604    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 10:00:57.624044    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 10:00:57.624071    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:00:57.624255    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000385aa0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:57.624298    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000385aa0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:57.624343    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machine
s/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:00:57.624392    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=t
tyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:00:57.624400    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:00:57.625747    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Pid is 3703
	I0731 10:00:57.626235    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 10:00:57.626251    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.626342    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:00:57.627854    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 10:00:57.627953    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:00:57.627977    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbf3e}
	I0731 10:00:57.628011    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:00:57.628034    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbebe}
	I0731 10:00:57.628052    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 10:00:57.628055    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 10:00:57.628122    3673 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 10:00:57.628751    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:00:57.628985    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.629386    3673 machine.go:94] provisionDockerMachine start ...
	I0731 10:00:57.629397    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:00:57.629529    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:00:57.629660    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:00:57.629776    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:00:57.629863    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:00:57.629968    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:00:57.630086    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:57.630251    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:00:57.630259    3673 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:00:57.633394    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:00:57.642233    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:00:57.643220    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:57.643237    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:57.643246    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:57.643254    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:58.024328    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:00:58.024355    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:00:58.138946    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:58.138964    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:58.138972    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:58.138978    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:58.139795    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:00:58.139805    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:01:03.704070    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:01:03.704154    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:01:03.704165    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:01:03.727851    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:01:08.691307    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:01:08.691320    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.691494    3673 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 10:01:08.691506    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.691592    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.691677    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:08.691764    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.691853    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.691954    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:08.692085    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:08.692236    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:08.692245    3673 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 10:01:08.755413    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 10:01:08.755429    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.755569    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:08.755667    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.755765    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.755854    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:08.755980    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:08.756132    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:08.756143    3673 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:01:08.818688    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:01:08.818702    3673 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:01:08.818713    3673 buildroot.go:174] setting up certificates
	I0731 10:01:08.818719    3673 provision.go:84] configureAuth start
	I0731 10:01:08.818725    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.818852    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:01:08.818933    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.819011    3673 provision.go:143] copyHostCerts
	I0731 10:01:08.819041    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:01:08.819091    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:01:08.819096    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:01:08.819226    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:01:08.819432    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:01:08.819463    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:01:08.819467    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:01:08.819545    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:01:08.819683    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:01:08.819712    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:01:08.819716    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:01:08.819792    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:01:08.819938    3673 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 10:01:09.050116    3673 provision.go:177] copyRemoteCerts
	I0731 10:01:09.050171    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:01:09.050188    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.050328    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.050426    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.050517    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.050597    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:09.085881    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:01:09.085963    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:01:09.105721    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:01:09.105784    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:01:09.125488    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:01:09.125555    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:01:09.145164    3673 provision.go:87] duration metric: took 326.438057ms to configureAuth
	I0731 10:01:09.145176    3673 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:01:09.145335    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:01:09.145348    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:09.145480    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.145573    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.145655    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.145735    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.145811    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.145938    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.146068    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.146076    3673 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:01:09.201832    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:01:09.201843    3673 buildroot.go:70] root file system type: tmpfs
	I0731 10:01:09.201923    3673 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:01:09.201934    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.202081    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.202179    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.202271    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.202354    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.202487    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.202618    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.202666    3673 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:01:09.267323    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:01:09.267343    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.267478    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.267567    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.267645    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.267729    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.267847    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.267982    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.267994    3673 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:01:10.914498    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:01:10.914513    3673 machine.go:97] duration metric: took 13.285120143s to provisionDockerMachine
	I0731 10:01:10.914520    3673 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 10:01:10.914527    3673 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:01:10.914537    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:10.914733    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:01:10.914747    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:10.914855    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:10.914953    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:10.915048    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:10.915144    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:10.959674    3673 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:01:10.963100    3673 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:01:10.963114    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:01:10.963203    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:01:10.963349    3673 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:01:10.963357    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:01:10.963530    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:01:10.972659    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:01:11.000647    3673 start.go:296] duration metric: took 86.118358ms for postStartSetup
	I0731 10:01:11.000670    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.000870    3673 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:01:11.000882    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.001001    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.001098    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.001173    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.001251    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:11.035137    3673 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:01:11.035197    3673 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:01:11.088856    3673 fix.go:56] duration metric: took 13.569681658s for fixHost
	I0731 10:01:11.088895    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.089041    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.089136    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.089222    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.089315    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.089453    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:11.089599    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:11.089606    3673 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:01:11.145954    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445271.149143027
	
	I0731 10:01:11.145966    3673 fix.go:216] guest clock: 1722445271.149143027
	I0731 10:01:11.145974    3673 fix.go:229] Guest: 2024-07-31 10:01:11.149143027 -0700 PDT Remote: 2024-07-31 10:01:11.088876 -0700 PDT m=+34.629889069 (delta=60.267027ms)
	I0731 10:01:11.145984    3673 fix.go:200] guest clock delta is within tolerance: 60.267027ms
	I0731 10:01:11.145988    3673 start.go:83] releasing machines lock for "ha-393000-m02", held for 13.626840447s
	I0731 10:01:11.146004    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.146144    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:01:11.168821    3673 out.go:177] * Found network options:
	I0731 10:01:11.189411    3673 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 10:01:11.210445    3673 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:01:11.210484    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211359    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211621    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211769    3673 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:01:11.211807    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 10:01:11.211854    3673 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:01:11.211952    3673 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:01:11.211972    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.212003    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.212195    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.212236    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.212390    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.212455    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.212607    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.212628    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:11.212741    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 10:01:11.245224    3673 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:01:11.245288    3673 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:01:11.292468    3673 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:01:11.292485    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:01:11.292564    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:01:11.308790    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:01:11.317853    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:01:11.326752    3673 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:01:11.326791    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:01:11.335723    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:01:11.344565    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:01:11.353617    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:01:11.362526    3673 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:01:11.371536    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:01:11.380589    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:01:11.389630    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:01:11.398848    3673 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:01:11.407046    3673 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:01:11.415065    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:01:11.507632    3673 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:01:11.526508    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:01:11.526575    3673 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:01:11.541590    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:01:11.552707    3673 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:01:11.574170    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:01:11.585642    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:01:11.595961    3673 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:01:11.615167    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:01:11.625493    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:01:11.640509    3673 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:01:11.643540    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:01:11.650600    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:01:11.664458    3673 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:01:11.766555    3673 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:01:11.880513    3673 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:01:11.880542    3673 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:01:11.894469    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:01:11.987172    3673 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:02:12.930966    3673 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1m0.943784713s)
	I0731 10:02:12.931036    3673 ssh_runner.go:195] Run: sudo journalctl --no-pager -u docker
	I0731 10:02:12.964792    3673 out.go:177] 
	W0731 10:02:12.986436    3673 out.go:239] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Jul 31 17:01:09 ha-393000-m02 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.540836219Z" level=info msg="Starting up"
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.541317477Z" level=info msg="containerd not running, starting managed containerd"
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.541838265Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=494
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.560371937Z" level=info msg="starting containerd" revision=8fc6bcff51318944179630522a095cc9dbf9f353 version=v1.7.20
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576586336Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576670079Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576715322Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576725763Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576901546Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576942171Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577133168Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577170137Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577183696Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577195352Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577298762Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577522478Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579179447Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579249243Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579392843Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579426223Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579535672Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579581480Z" level=info msg="metadata content store policy set" policy=shared
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581466765Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581512332Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581524910Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581534838Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581543733Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581622493Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581841090Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581949001Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581963875Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581973012Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581991066Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582002817Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582011239Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582020290Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582029399Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582037767Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582045966Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582053831Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582071064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582081124Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582091080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582100077Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582108106Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582116349Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582123631Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582131784Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582141489Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582150904Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582158314Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582166201Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582174064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582185286Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582198762Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582207204Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582214973Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582263286Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582297170Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582306849Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582315043Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582321631Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582330079Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582340119Z" level=info msg="NRI interface is disabled by configuration."
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582481302Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582557809Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582588544Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582634793Z" level=info msg="containerd successfully booted in 0.023010s"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.561555310Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.591279604Z" level=info msg="Loading containers: start."
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.773936432Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.836555927Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.880452097Z" level=info msg="Loading containers: done."
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.887082310Z" level=info msg="Docker daemon" commit=cc13f95 containerd-snapshotter=false storage-driver=overlay2 version=27.1.1
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.887241928Z" level=info msg="Daemon has completed initialization"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.912027531Z" level=info msg="API listen on /var/run/docker.sock"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.912107549Z" level=info msg="API listen on [::]:2376"
	Jul 31 17:01:10 ha-393000-m02 systemd[1]: Started Docker Application Container Engine.
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.014698698Z" level=info msg="Processing signal 'terminated'"
	Jul 31 17:01:12 ha-393000-m02 systemd[1]: Stopping Docker Application Container Engine...
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.015851363Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016218102Z" level=info msg="Daemon shutdown complete"
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016264206Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016277600Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: docker.service: Deactivated successfully.
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: Stopped Docker Application Container Engine.
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 17:01:13 ha-393000-m02 dockerd[1165]: time="2024-07-31T17:01:13.051074379Z" level=info msg="Starting up"
	Jul 31 17:02:13 ha-393000-m02 dockerd[1165]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: docker.service: Failed with result 'exit-code'.
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	W0731 10:02:12.986535    3673 out.go:239] * 
	W0731 10:02:12.987705    3673 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:02:13.049576    3673 out.go:177] 
	
	
	==> Docker <==
	Jul 31 17:01:03 ha-393000 dockerd[1182]: time="2024-07-31T17:01:03.955592926Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:25 ha-393000 dockerd[1176]: time="2024-07-31T17:01:25.077687038Z" level=info msg="ignoring event" container=8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:01:25 ha-393000 dockerd[1182]: time="2024-07-31T17:01:25.078615114Z" level=info msg="shim disconnected" id=8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7 namespace=moby
	Jul 31 17:01:25 ha-393000 dockerd[1182]: time="2024-07-31T17:01:25.079358383Z" level=warning msg="cleaning up after shim disconnected" id=8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7 namespace=moby
	Jul 31 17:01:25 ha-393000 dockerd[1182]: time="2024-07-31T17:01:25.079401742Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:01:26 ha-393000 dockerd[1176]: time="2024-07-31T17:01:26.090101886Z" level=info msg="ignoring event" container=375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:01:26 ha-393000 dockerd[1182]: time="2024-07-31T17:01:26.090726839Z" level=info msg="shim disconnected" id=375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3 namespace=moby
	Jul 31 17:01:26 ha-393000 dockerd[1182]: time="2024-07-31T17:01:26.090779643Z" level=warning msg="cleaning up after shim disconnected" id=375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3 namespace=moby
	Jul 31 17:01:26 ha-393000 dockerd[1182]: time="2024-07-31T17:01:26.090788476Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149678222Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149721539Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149733538Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149849844Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.145857339Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.146175734Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.146320206Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.146525965Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:02:03 ha-393000 dockerd[1176]: time="2024-07-31T17:02:03.637026559Z" level=info msg="ignoring event" container=e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:02:03 ha-393000 dockerd[1182]: time="2024-07-31T17:02:03.636863270Z" level=info msg="shim disconnected" id=e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb namespace=moby
	Jul 31 17:02:03 ha-393000 dockerd[1182]: time="2024-07-31T17:02:03.637470771Z" level=warning msg="cleaning up after shim disconnected" id=e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb namespace=moby
	Jul 31 17:02:03 ha-393000 dockerd[1182]: time="2024-07-31T17:02:03.637576887Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1182]: time="2024-07-31T17:02:11.770900358Z" level=info msg="shim disconnected" id=dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222 namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1182]: time="2024-07-31T17:02:11.770966496Z" level=warning msg="cleaning up after shim disconnected" id=dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222 namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1182]: time="2024-07-31T17:02:11.770975588Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1176]: time="2024-07-31T17:02:11.771381326Z" level=info msg="ignoring event" container=dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	dd8064769032d       76932a3b37d7e                                                                                         25 seconds ago       Exited              kube-controller-manager   2                   626ea84aade06       kube-controller-manager-ha-393000
	e557dfd18a90c       1f6d574d502f3                                                                                         33 seconds ago       Exited              kube-apiserver            2                   194073f1c5ac9       kube-apiserver-ha-393000
	86018b08bbaa1       3861cfcd7c04c                                                                                         About a minute ago   Running             etcd                      1                   ba75e4f4299bf       etcd-ha-393000
	5fcb6f7d8ab78       38af8ddebf499                                                                                         About a minute ago   Running             kube-vip                  0                   e6198932cc027       kube-vip-ha-393000
	d088fefe5f8e3       3edc18e7b7672                                                                                         About a minute ago   Running             kube-scheduler            1                   f04a7ecd568d2       kube-scheduler-ha-393000
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   5 minutes ago        Exited              busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         7 minutes ago        Exited              coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         7 minutes ago        Exited              coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	6d966e37d3618       6e38f40d628db                                                                                         7 minutes ago        Exited              storage-provisioner       0                   25b3d6db405f4       storage-provisioner
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              8 minutes ago        Exited              kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         8 minutes ago        Exited              kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	e68314e525ef8       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     8 minutes ago        Exited              kube-vip                  0                   c9f21d49b1384       kube-vip-ha-393000
	63e56744c84ee       3861cfcd7c04c                                                                                         8 minutes ago        Exited              etcd                      0                   f8f20b1290499       etcd-ha-393000
	65412448c586b       3edc18e7b7672                                                                                         8 minutes ago        Exited              kube-scheduler            0                   7ab9affa89eca       kube-scheduler-ha-393000
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [feda36fb8a03] <==
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E0731 17:02:17.335366    2770 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:17.335771    2770 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:17.337362    2770 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:17.337629    2770 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:17.338992    2770 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[  +0.000000] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000001] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.035932] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.007966] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[  +5.670742] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.007034] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.684430] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.291972] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000007] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +2.476130] systemd-fstab-generator[479]: Ignoring "noauto" option for root device
	[  +0.098443] systemd-fstab-generator[491]: Ignoring "noauto" option for root device
	[  +1.350473] kauditd_printk_skb: 42 callbacks suppressed
	[  +0.720403] systemd-fstab-generator[1105]: Ignoring "noauto" option for root device
	[  +0.253450] systemd-fstab-generator[1142]: Ignoring "noauto" option for root device
	[  +0.122356] systemd-fstab-generator[1154]: Ignoring "noauto" option for root device
	[  +0.120102] systemd-fstab-generator[1168]: Ignoring "noauto" option for root device
	[  +2.456184] systemd-fstab-generator[1382]: Ignoring "noauto" option for root device
	[  +0.101772] systemd-fstab-generator[1394]: Ignoring "noauto" option for root device
	[  +0.097595] systemd-fstab-generator[1406]: Ignoring "noauto" option for root device
	[  +0.132878] systemd-fstab-generator[1422]: Ignoring "noauto" option for root device
	[  +0.441269] systemd-fstab-generator[1586]: Ignoring "noauto" option for root device
	[Jul31 17:01] kauditd_printk_skb: 271 callbacks suppressed
	[ +21.458149] kauditd_printk_skb: 40 callbacks suppressed
	
	
	==> etcd [63e56744c84e] <==
	{"level":"info","ts":"2024-07-31T17:00:28.797806Z","caller":"traceutil/trace.go:171","msg":"trace[2004834865] range","detail":"{range_begin:/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath; range_end:; }","duration":"7.769094578s","start":"2024-07-31T17:00:21.028708Z","end":"2024-07-31T17:00:28.797803Z","steps":["trace[2004834865] 'agreement among raft nodes before linearized reading'  (duration: 7.769083488s)"],"step_count":1}
	{"level":"warn","ts":"2024-07-31T17:00:28.797814Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-07-31T17:00:21.028699Z","time spent":"7.769113485s","remote":"127.0.0.1:48488","response type":"/etcdserverpb.KV/Range","request count":0,"request size":67,"response count":0,"response size":0,"request content":"key:\"/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath\" "}
	2024/07/31 17:00:28 WARNING: [core] [Server #5] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-07-31T17:00:28.797862Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-07-31T17:00:27.051463Z","time spent":"1.746397856s","remote":"127.0.0.1:48588","response type":"/etcdserverpb.KV/Txn","request count":0,"request size":0,"response count":0,"response size":0,"request content":""}
	2024/07/31 17:00:28 WARNING: [core] [Server #5] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-07-31T17:00:28.892734Z","caller":"embed/serve.go:212","msg":"stopping secure grpc server due to error","error":"accept tcp 192.169.0.5:2379: use of closed network connection"}
	{"level":"warn","ts":"2024-07-31T17:00:28.892779Z","caller":"embed/serve.go:214","msg":"stopped secure grpc server due to error","error":"accept tcp 192.169.0.5:2379: use of closed network connection"}
	{"level":"info","ts":"2024-07-31T17:00:28.892813Z","caller":"etcdserver/server.go:1462","msg":"skipped leadership transfer; local server is not leader","local-member-id":"b8c6c7563d17d844","current-leader-member-id":"0"}
	{"level":"info","ts":"2024-07-31T17:00:28.893872Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.89389Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.893909Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894003Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894029Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894052Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.89406Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894064Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894069Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894081Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894256Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.89428Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894301Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.89431Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.900461Z","caller":"embed/etcd.go:579","msg":"stopping serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-07-31T17:00:28.900567Z","caller":"embed/etcd.go:584","msg":"stopped serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-07-31T17:00:28.900576Z","caller":"embed/etcd.go:377","msg":"closed etcd server","name":"ha-393000","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.169.0.5:2380"],"advertise-client-urls":["https://192.169.0.5:2379"]}
	
	
	==> etcd [86018b08bbaa] <==
	{"level":"info","ts":"2024-07-31T17:02:10.908402Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:10.908419Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:10.908429Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:12.706453Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:12.706497Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:12.706792Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:12.706867Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:12.706926Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:02:14.336186Z","caller":"etcdserver/server.go:2089","msg":"failed to publish local member to cluster through raft","local-member-id":"b8c6c7563d17d844","local-member-attributes":"{Name:ha-393000 ClientURLs:[https://192.169.0.5:2379]}","request-path":"/0/members/b8c6c7563d17d844/attributes","publish-timeout":"7s","error":"etcdserver: request timed out"}
	{"level":"warn","ts":"2024-07-31T17:02:14.364851Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:02:14.364914Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: i/o timeout"}
	{"level":"warn","ts":"2024-07-31T17:02:14.364925Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: i/o timeout"}
	{"level":"warn","ts":"2024-07-31T17:02:14.364923Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"info","ts":"2024-07-31T17:02:14.507119Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507168Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507183Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507199Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507209Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:02:15.5398Z","caller":"etcdhttp/health.go:232","msg":"serving /health false; no leader"}
	{"level":"warn","ts":"2024-07-31T17:02:15.539893Z","caller":"etcdhttp/health.go:119","msg":"/health error","output":"{\"health\":\"false\",\"reason\":\"RAFT NO LEADER\"}","status-code":503}
	{"level":"info","ts":"2024-07-31T17:02:16.306371Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:16.306422Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:16.306447Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:16.306457Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:16.306463Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	
	
	==> kernel <==
	 17:02:17 up 1 min,  0 users,  load average: 0.17, 0.09, 0.03
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:59:40.110698       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:59:50.118349       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:59:50.118427       1 main.go:299] handling current node
	I0731 16:59:50.118450       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:59:50.118464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:59:50.118651       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:59:50.118739       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.118883       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:00.118987       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:00.119126       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:00.119236       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.119356       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:00.119483       1 main.go:299] handling current node
	I0731 17:00:10.110002       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:10.111054       1 main.go:299] handling current node
	I0731 17:00:10.111286       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:10.111319       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:10.111445       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:10.111480       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:20.116250       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:20.116442       1 main.go:299] handling current node
	I0731 17:00:20.116458       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:20.116464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:20.116608       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:20.116672       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [e557dfd18a90] <==
	I0731 17:01:43.256034       1 options.go:221] external host was not specified, using 192.169.0.5
	I0731 17:01:43.256765       1 server.go:148] Version: v1.30.3
	I0731 17:01:43.256805       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:01:43.612164       1 shared_informer.go:313] Waiting for caches to sync for node_authorizer
	I0731 17:01:43.614459       1 shared_informer.go:313] Waiting for caches to sync for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:01:43.616925       1 plugins.go:157] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0731 17:01:43.616979       1 plugins.go:160] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0731 17:01:43.617139       1 instance.go:299] Using reconciler: lease
	W0731 17:02:03.611949       1 logging.go:59] [core] [Channel #2 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0731 17:02:03.612150       1 logging.go:59] [core] [Channel #1 SubChannel #3] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	F0731 17:02:03.618964       1 instance.go:292] Error creating leases: error creating storage factory: context deadline exceeded
	
	
	==> kube-controller-manager [dd8064769032] <==
	I0731 17:01:51.487714       1 serving.go:380] Generated self-signed cert in-memory
	I0731 17:01:51.747070       1 controllermanager.go:189] "Starting" version="v1.30.3"
	I0731 17:01:51.747213       1 controllermanager.go:191] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:01:51.750433       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0731 17:01:51.750809       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:01:51.750880       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0731 17:01:51.750924       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	E0731 17:02:11.753015       1 controllermanager.go:234] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.169.0.5:8443/healthz\": dial tcp 192.169.0.5:8443: connect: connection refused"
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [65412448c586] <==
	E0731 16:53:48.491132       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0731 16:53:48.491335       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:48.491387       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0731 16:53:48.491507       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0731 16:53:48.491594       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0731 16:53:48.491662       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:48.491738       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:48.491818       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:48.491860       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:48.491537       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:48.491873       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.319781       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0731 16:53:49.319838       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0731 16:53:49.326442       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.326478       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.392116       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:49.392172       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:49.496014       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.496036       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.541411       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:49.541927       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:49.588695       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:49.588735       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	I0731 16:53:49.982415       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0731 17:00:28.842533       1 run.go:74] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [d088fefe5f8e] <==
	Trace[36189415]: ---"Objects listed" error:Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (17:01:57.191)
	Trace[36189415]: [10.000695567s] [10.000695567s] END
	E0731 17:01:57.191818       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": net/http: TLS handshake timeout
	W0731 17:02:04.142372       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:04.142613       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:04.628727       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55278->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.628988       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55278->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:04.628727       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55294->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.629137       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55294->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:04.629274       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55260->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.629350       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55260->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:04.629024       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: Get "https://192.169.0.5:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55272->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.629603       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://192.169.0.5:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55272->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:05.172498       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:05.172740       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:14.697016       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:14.697068       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:15.773865       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: Get "https://192.169.0.5:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:15.773914       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://192.169.0.5:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:16.600223       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:16.600272       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:17.841327       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:17.841376       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:18.163867       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: Get "https://192.169.0.5:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:18.163910       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.169.0.5:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	
	
	==> kubelet <==
	Jul 31 17:01:57 ha-393000 kubelet[1593]: E0731 17:01:57.152916    1593 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ha-393000\" not found"
	Jul 31 17:01:58 ha-393000 kubelet[1593]: E0731 17:01:58.501133    1593 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://control-plane.minikube.internal:8443/api/v1/namespaces/default/events\": dial tcp 192.169.0.254:8443: connect: no route to host" event="&Event{ObjectMeta:{ha-393000.17e75ad5dc50e2da  default    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ha-393000,UID:ha-393000,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ha-393000,},FirstTimestamp:2024-07-31 17:00:57.063326426 +0000 UTC m=+0.117315655,LastTimestamp:2024-07-31 17:00:57.063326426 +0000 UTC m=+0.117315655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ha-393000,}"
	Jul 31 17:02:02 ha-393000 kubelet[1593]: I0731 17:02:02.431480    1593 kubelet_node_status.go:73] "Attempting to register node" node="ha-393000"
	Jul 31 17:02:03 ha-393000 kubelet[1593]: I0731 17:02:03.874945    1593 scope.go:117] "RemoveContainer" containerID="8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7"
	Jul 31 17:02:03 ha-393000 kubelet[1593]: I0731 17:02:03.875815    1593 scope.go:117] "RemoveContainer" containerID="e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb"
	Jul 31 17:02:03 ha-393000 kubelet[1593]: E0731 17:02:03.876093    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver pod=kube-apiserver-ha-393000_kube-system(97d9112f0375227c852af24f4082cc7e)\"" pod="kube-system/kube-apiserver-ha-393000" podUID="97d9112f0375227c852af24f4082cc7e"
	Jul 31 17:02:04 ha-393000 kubelet[1593]: E0731 17:02:04.641093    1593 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://control-plane.minikube.internal:8443/api/v1/nodes\": dial tcp 192.169.0.254:8443: connect: no route to host" node="ha-393000"
	Jul 31 17:02:04 ha-393000 kubelet[1593]: E0731 17:02:04.641160    1593 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://control-plane.minikube.internal:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ha-393000?timeout=10s\": dial tcp 192.169.0.254:8443: connect: no route to host" interval="7s"
	Jul 31 17:02:07 ha-393000 kubelet[1593]: E0731 17:02:07.161123    1593 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ha-393000\" not found"
	Jul 31 17:02:07 ha-393000 kubelet[1593]: I0731 17:02:07.713966    1593 scope.go:117] "RemoveContainer" containerID="e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb"
	Jul 31 17:02:07 ha-393000 kubelet[1593]: E0731 17:02:07.714569    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver pod=kube-apiserver-ha-393000_kube-system(97d9112f0375227c852af24f4082cc7e)\"" pod="kube-system/kube-apiserver-ha-393000" podUID="97d9112f0375227c852af24f4082cc7e"
	Jul 31 17:02:10 ha-393000 kubelet[1593]: I0731 17:02:10.692480    1593 scope.go:117] "RemoveContainer" containerID="e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb"
	Jul 31 17:02:10 ha-393000 kubelet[1593]: E0731 17:02:10.693256    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver pod=kube-apiserver-ha-393000_kube-system(97d9112f0375227c852af24f4082cc7e)\"" pod="kube-system/kube-apiserver-ha-393000" podUID="97d9112f0375227c852af24f4082cc7e"
	Jul 31 17:02:10 ha-393000 kubelet[1593]: E0731 17:02:10.785079    1593 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://control-plane.minikube.internal:8443/api/v1/namespaces/default/events\": dial tcp 192.169.0.254:8443: connect: no route to host" event="&Event{ObjectMeta:{ha-393000.17e75ad5dc50e2da  default    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ha-393000,UID:ha-393000,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ha-393000,},FirstTimestamp:2024-07-31 17:00:57.063326426 +0000 UTC m=+0.117315655,LastTimestamp:2024-07-31 17:00:57.063326426 +0000 UTC m=+0.117315655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ha-393000,}"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: I0731 17:02:11.644681    1593 kubelet_node_status.go:73] "Attempting to register node" node="ha-393000"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: I0731 17:02:11.957788    1593 scope.go:117] "RemoveContainer" containerID="375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: I0731 17:02:11.958478    1593 scope.go:117] "RemoveContainer" containerID="dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: E0731 17:02:11.958720    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-ha-393000_kube-system(ae5c50a5b151d76ab8b2e88315db2b23)\"" pod="kube-system/kube-controller-manager-ha-393000" podUID="ae5c50a5b151d76ab8b2e88315db2b23"
	Jul 31 17:02:13 ha-393000 kubelet[1593]: W0731 17:02:13.855738    1593 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://control-plane.minikube.internal:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.254:8443: connect: no route to host
	Jul 31 17:02:13 ha-393000 kubelet[1593]: E0731 17:02:13.855784    1593 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://control-plane.minikube.internal:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.254:8443: connect: no route to host
	Jul 31 17:02:13 ha-393000 kubelet[1593]: E0731 17:02:13.855832    1593 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://control-plane.minikube.internal:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ha-393000?timeout=10s\": dial tcp 192.169.0.254:8443: connect: no route to host" interval="7s"
	Jul 31 17:02:13 ha-393000 kubelet[1593]: E0731 17:02:13.855916    1593 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://control-plane.minikube.internal:8443/api/v1/nodes\": dial tcp 192.169.0.254:8443: connect: no route to host" node="ha-393000"
	Jul 31 17:02:15 ha-393000 kubelet[1593]: I0731 17:02:15.731620    1593 scope.go:117] "RemoveContainer" containerID="dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222"
	Jul 31 17:02:15 ha-393000 kubelet[1593]: E0731 17:02:15.731858    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-ha-393000_kube-system(ae5c50a5b151d76ab8b2e88315db2b23)\"" pod="kube-system/kube-controller-manager-ha-393000" podUID="ae5c50a5b151d76ab8b2e88315db2b23"
	Jul 31 17:02:17 ha-393000 kubelet[1593]: E0731 17:02:17.168732    1593 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ha-393000\" not found"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:254: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000: exit status 2 (146.71916ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:254: status error: exit status 2 (may be ok)
helpers_test.go:256: "ha-393000" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestMultiControlPlane/serial/DeleteSecondaryNode (2.92s)

                                                
                                    
TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (2.8s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete
ha_test.go:390: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
ha_test.go:413: expected profile "ha-393000" in json of 'profile list' to have "Degraded" status but have "Stopped" status. got *"{\"invalid\":[],\"valid\":[{\"Name\":\"ha-393000\",\"Status\":\"Stopped\",\"Config\":{\"Name\":\"ha-393000\",\"KeepContext\":false,\"EmbedCerts\":false,\"MinikubeISO\":\"https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso\",\"KicBaseImage\":\"gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7\",\"Memory\":2200,\"CPUs\":2,\"DiskSize\":20000,\"Driver\":\"hyperkit\",\"HyperkitVpnKitSock\":\"\",\"HyperkitVSockPorts\":[],\"DockerEnv\":null,\"ContainerVolumeMounts\":null,\"InsecureRegistry\":null,\"RegistryMirror\":[],\"HostOnlyCIDR\":\"192.168.59.1/24\",\"HypervVirtualSwitch\":\"\",\"HypervUseExternalSwitch\":false,\"HypervExternalAdapter\":\"\",\"KVMNetwork\":\"default\",\"KVMQemuURI\":\"qemu:///system\",\"KVMGPU\":false,\"KVMHidden\":false,\"KVMNUMACoun
t\":1,\"APIServerPort\":8443,\"DockerOpt\":null,\"DisableDriverMounts\":false,\"NFSShare\":[],\"NFSSharesRoot\":\"/nfsshares\",\"UUID\":\"\",\"NoVTXCheck\":false,\"DNSProxy\":false,\"HostDNSResolver\":true,\"HostOnlyNicType\":\"virtio\",\"NatNicType\":\"virtio\",\"SSHIPAddress\":\"\",\"SSHUser\":\"root\",\"SSHKey\":\"\",\"SSHPort\":22,\"KubernetesConfig\":{\"KubernetesVersion\":\"v1.30.3\",\"ClusterName\":\"ha-393000\",\"Namespace\":\"default\",\"APIServerHAVIP\":\"192.169.0.254\",\"APIServerName\":\"minikubeCA\",\"APIServerNames\":null,\"APIServerIPs\":null,\"DNSDomain\":\"cluster.local\",\"ContainerRuntime\":\"docker\",\"CRISocket\":\"\",\"NetworkPlugin\":\"cni\",\"FeatureGates\":\"\",\"ServiceCIDR\":\"10.96.0.0/12\",\"ImageRepository\":\"\",\"LoadBalancerStartIP\":\"\",\"LoadBalancerEndIP\":\"\",\"CustomIngressCert\":\"\",\"RegistryAliases\":\"\",\"ExtraOptions\":null,\"ShouldLoadCachedImages\":true,\"EnableDefaultCNI\":false,\"CNI\":\"\"},\"Nodes\":[{\"Name\":\"\",\"IP\":\"192.169.0.5\",\"Port\":8443,\"Ku
bernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m02\",\"IP\":\"192.169.0.6\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m03\",\"IP\":\"192.169.0.7\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m04\",\"IP\":\"192.169.0.8\",\"Port\":0,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":false,\"Worker\":true}],\"Addons\":{\"ambassador\":false,\"auto-pause\":false,\"cloud-spanner\":false,\"csi-hostpath-driver\":false,\"dashboard\":false,\"default-storageclass\":false,\"efk\":false,\"freshpod\":false,\"gcp-auth\":false,\"gvisor\":false,\"headlamp\":false,\"helm-tiller\":false,\"inaccel\":false,\"ingress\":false,\"ingress-dns\":false,\"inspektor-gadget\":false,\"istio\":false,\"istio-provisioner\":false,\"kong\":false,\"kubeflow\":false,\"kubevirt\"
:false,\"logviewer\":false,\"metallb\":false,\"metrics-server\":false,\"nvidia-device-plugin\":false,\"nvidia-driver-installer\":false,\"nvidia-gpu-device-plugin\":false,\"olm\":false,\"pod-security-policy\":false,\"portainer\":false,\"registry\":false,\"registry-aliases\":false,\"registry-creds\":false,\"storage-provisioner\":false,\"storage-provisioner-gluster\":false,\"storage-provisioner-rancher\":false,\"volcano\":false,\"volumesnapshots\":false,\"yakd\":false},\"CustomAddonImages\":null,\"CustomAddonRegistries\":null,\"VerifyComponents\":{\"apiserver\":true,\"apps_running\":true,\"default_sa\":true,\"extra\":true,\"kubelet\":true,\"node_ready\":true,\"system_pods\":true},\"StartHostTimeout\":360000000000,\"ScheduledStop\":null,\"ExposedPorts\":[],\"ListenAddress\":\"\",\"Network\":\"\",\"Subnet\":\"\",\"MultiNodeRequested\":true,\"ExtraDisks\":0,\"CertExpiration\":94608000000000000,\"Mount\":false,\"MountString\":\"/Users:/minikube-host\",\"Mount9PVersion\":\"9p2000.L\",\"MountGID\":\"docker\",\"MountIP
\":\"\",\"MountMSize\":262144,\"MountOptions\":[],\"MountPort\":0,\"MountType\":\"9p\",\"MountUID\":\"docker\",\"BinaryMirror\":\"\",\"DisableOptimizations\":false,\"DisableMetrics\":false,\"CustomQemuFirmwarePath\":\"\",\"SocketVMnetClientPath\":\"\",\"SocketVMnetPath\":\"\",\"StaticIP\":\"\",\"SSHAuthSock\":\"\",\"SSHAgentPID\":0,\"GPUs\":\"\",\"AutoPauseInterval\":60000000000},\"Active\":false,\"ActiveKubeContext\":true}]}"*. args: "out/minikube-darwin-amd64 profile list --output json"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000: exit status 2 (148.236349ms)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:239: status error: exit status 2 (may be ok)
helpers_test.go:244: <<< TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (2.216489542s)
helpers_test.go:252: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node stop m02 -v=7         | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:58 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node start m02 -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:59 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000 -v=7               | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-393000 -v=7                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT | 31 Jul 24 10:00 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	| node    | ha-393000 node delete m03 -v=7       | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 10:00:36
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 10:00:36.495656    3673 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:00:36.495846    3673 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:00:36.495851    3673 out.go:304] Setting ErrFile to fd 2...
	I0731 10:00:36.495855    3673 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:00:36.496034    3673 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:00:36.497502    3673 out.go:298] Setting JSON to false
	I0731 10:00:36.520791    3673 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1806,"bootTime":1722443430,"procs":453,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:00:36.520876    3673 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:00:36.543180    3673 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 10:00:36.586807    3673 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:00:36.586859    3673 notify.go:220] Checking for updates...
	I0731 10:00:36.634547    3673 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:36.676763    3673 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:00:36.720444    3673 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:00:36.764479    3673 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:00:36.807390    3673 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:00:36.829325    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:36.829489    3673 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:00:36.830157    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.830242    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:36.839843    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51805
	I0731 10:00:36.840174    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:36.840633    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:36.840652    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:36.840857    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:36.840969    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:36.869457    3673 out.go:177] * Using the hyperkit driver based on existing profile
	I0731 10:00:36.911576    3673 start.go:297] selected driver: hyperkit
	I0731 10:00:36.911605    3673 start.go:901] validating driver "hyperkit" against &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:fals
e efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p20
00.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:36.911854    3673 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:00:36.912050    3673 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:00:36.912259    3673 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:00:36.921863    3673 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:00:36.925765    3673 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.925786    3673 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:00:36.929097    3673 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:00:36.929172    3673 cni.go:84] Creating CNI manager for ""
	I0731 10:00:36.929182    3673 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:00:36.929256    3673 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.16
9.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-t
iller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0
MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:36.929358    3673 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:00:36.971579    3673 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 10:00:36.992648    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:36.992745    3673 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:00:36.992770    3673 cache.go:56] Caching tarball of preloaded images
	I0731 10:00:36.992959    3673 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:00:36.992977    3673 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:00:36.993162    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:36.993985    3673 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:00:36.994105    3673 start.go:364] duration metric: took 96.51µs to acquireMachinesLock for "ha-393000"
	I0731 10:00:36.994138    3673 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:00:36.994154    3673 fix.go:54] fixHost starting: 
	I0731 10:00:36.994597    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:36.994672    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:37.003590    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51807
	I0731 10:00:37.003945    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:37.004312    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:37.004326    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:37.004582    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:37.004711    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:37.004817    3673 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:00:37.004901    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.004979    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 2965
	I0731 10:00:37.005943    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 2965 missing from process table
	I0731 10:00:37.005965    3673 fix.go:112] recreateIfNeeded on ha-393000: state=Stopped err=<nil>
	I0731 10:00:37.005979    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	W0731 10:00:37.006061    3673 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:00:37.048650    3673 out.go:177] * Restarting existing hyperkit VM for "ha-393000" ...
	I0731 10:00:37.069570    3673 main.go:141] libmachine: (ha-393000) Calling .Start
	I0731 10:00:37.069827    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.069877    3673 main.go:141] libmachine: (ha-393000) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 10:00:37.071916    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 2965 missing from process table
	I0731 10:00:37.071929    3673 main.go:141] libmachine: (ha-393000) DBG | pid 2965 is in state "Stopped"
	I0731 10:00:37.071946    3673 main.go:141] libmachine: (ha-393000) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid...
	I0731 10:00:37.072316    3673 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 10:00:37.238669    3673 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 10:00:37.238692    3673 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:00:37.238840    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c1020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:37.238867    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c1020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:37.238912    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=s
erial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:00:37.238957    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset
norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:00:37.238973    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:00:37.240553    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 DEBUG: hyperkit: Pid is 3685
	I0731 10:00:37.240991    3673 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 10:00:37.241011    3673 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:37.241081    3673 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:00:37.243087    3673 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 10:00:37.243212    3673 main.go:141] libmachine: (ha-393000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:00:37.243230    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:00:37.243264    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbebe}
	I0731 10:00:37.243290    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 10:00:37.243299    3673 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 10:00:37.243315    3673 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbd86}
	I0731 10:00:37.243326    3673 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 10:00:37.243339    3673 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 10:00:37.243975    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:37.244206    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:37.244644    3673 machine.go:94] provisionDockerMachine start ...
	I0731 10:00:37.244655    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:37.244765    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:37.244869    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:37.244966    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:37.245080    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:37.245168    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:37.245280    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:37.245490    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:37.245498    3673 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:00:37.248660    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:00:37.300309    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:00:37.301000    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:37.301012    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:37.301019    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:37.301029    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:37.684614    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:00:37.684630    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:00:37.799569    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:37.799605    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:37.799644    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:37.799683    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:37.800441    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:00:37.800452    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:37 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:00:43.367703    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:00:43.367775    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:00:43.367785    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:00:43.391726    3673 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:00:43 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:00:47.223864    3673 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.5:22: connect: connection refused
	I0731 10:00:50.289047    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:00:50.289062    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.289203    3673 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 10:00:50.289213    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.289308    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.289398    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.289487    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.289585    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.289691    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.289830    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.289999    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.290007    3673 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 10:00:50.363752    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 10:00:50.363772    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.363906    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.364014    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.364093    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.364179    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.364291    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.364432    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.364443    3673 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:00:50.433878    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:00:50.433898    3673 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:00:50.433922    3673 buildroot.go:174] setting up certificates
	I0731 10:00:50.433931    3673 provision.go:84] configureAuth start
	I0731 10:00:50.433938    3673 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:00:50.434079    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:50.434192    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.434286    3673 provision.go:143] copyHostCerts
	I0731 10:00:50.434327    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:00:50.434402    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:00:50.434411    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:00:50.434544    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:00:50.434743    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:00:50.434783    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:00:50.434794    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:00:50.434879    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:00:50.435018    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:00:50.435058    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:00:50.435063    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:00:50.435178    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:00:50.435321    3673 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 10:00:50.506730    3673 provision.go:177] copyRemoteCerts
	I0731 10:00:50.506778    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:00:50.506790    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.506910    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.507000    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.507081    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.507175    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:50.545550    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:00:50.545628    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 10:00:50.565303    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:00:50.565359    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:00:50.584957    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:00:50.585022    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0731 10:00:50.604884    3673 provision.go:87] duration metric: took 170.940154ms to configureAuth
	I0731 10:00:50.604897    3673 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:00:50.605065    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:50.605078    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:50.605206    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.605298    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.605377    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.605465    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.605532    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.605631    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.605760    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.605768    3673 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:00:50.667182    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:00:50.667193    3673 buildroot.go:70] root file system type: tmpfs
	I0731 10:00:50.667267    3673 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:00:50.667284    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.667424    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.667511    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.667602    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.667687    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.667814    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.668002    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.668046    3673 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:00:50.740959    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:00:50.740978    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:50.741112    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:50.741203    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.741293    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:50.741371    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:50.741505    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:50.741655    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:50.741668    3673 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:00:52.502025    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:00:52.502040    3673 machine.go:97] duration metric: took 15.257390709s to provisionDockerMachine
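	The unit refresh that just completed relies on a compare-then-swap idiom: render the candidate unit next to the live one, then replace it and reload systemd only when `diff` reports a difference (or the live file is missing, as the `can't stat` output above shows). A minimal sketch of that idiom with hypothetical paths and an `echo` standing in for the `systemctl daemon-reload`/`restart` step the log actually runs:

```shell
# Install a rendered config file only when its content changed.
# Hypothetical helper; the log targets /lib/systemd/system/docker.service
# and follows the swap with systemctl daemon-reload / enable / restart.
install_if_changed() {
  src="$1"   # newly rendered file, e.g. docker.service.new
  dst="$2"   # live unit file (may not exist yet)
  if diff -u "$dst" "$src" >/dev/null 2>&1; then
    rm -f "$src"      # identical: discard candidate, nothing to reload
    echo "unchanged"
  else
    mv "$src" "$dst"  # differs or dst missing: swap the new file in
    echo "updated"    # real code would daemon-reload + restart here
  fi
}
```

Discarding the candidate on the unchanged path keeps the operation idempotent, which matters because minikube re-runs this provisioning step on every `start`.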
	I0731 10:00:52.502051    3673 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 10:00:52.502059    3673 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:00:52.502069    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.502248    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:00:52.502270    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.502370    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.502470    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.502555    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.502643    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.547198    3673 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:00:52.551739    3673 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:00:52.551756    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:00:52.551866    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:00:52.552049    3673 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:00:52.552056    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:00:52.552268    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:00:52.560032    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:00:52.588419    3673 start.go:296] duration metric: took 86.358285ms for postStartSetup
	I0731 10:00:52.588445    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.588627    3673 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:00:52.588639    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.588726    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.588826    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.588924    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.589019    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.626220    3673 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:00:52.626280    3673 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:00:52.679056    3673 fix.go:56] duration metric: took 15.684908032s for fixHost
	I0731 10:00:52.679077    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.679217    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.679309    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.679414    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.679512    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.679640    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:52.679796    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:00:52.679804    3673 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:00:52.743374    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445252.855355235
	
	I0731 10:00:52.743388    3673 fix.go:216] guest clock: 1722445252.855355235
	I0731 10:00:52.743395    3673 fix.go:229] Guest: 2024-07-31 10:00:52.855355235 -0700 PDT Remote: 2024-07-31 10:00:52.679067 -0700 PDT m=+16.220078068 (delta=176.288235ms)
	I0731 10:00:52.743428    3673 fix.go:200] guest clock delta is within tolerance: 176.288235ms
	I0731 10:00:52.743433    3673 start.go:83] releasing machines lock for "ha-393000", held for 15.749318892s
	I0731 10:00:52.743452    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.743591    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:52.743689    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.743983    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.744104    3673 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:00:52.744194    3673 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:00:52.744225    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.744256    3673 ssh_runner.go:195] Run: cat /version.json
	I0731 10:00:52.744267    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:00:52.744309    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.744357    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:00:52.744392    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.744456    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:00:52.744481    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.744548    3673 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:00:52.744567    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.744621    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:00:52.782140    3673 ssh_runner.go:195] Run: systemctl --version
	I0731 10:00:52.830138    3673 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 10:00:52.835167    3673 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:00:52.835215    3673 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:00:52.850271    3673 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:00:52.850283    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:00:52.850381    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:00:52.866468    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:00:52.875484    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:00:52.884391    3673 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:00:52.884442    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:00:52.893278    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:00:52.902262    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:00:52.911487    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:00:52.920398    3673 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:00:52.929249    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:00:52.938174    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:00:52.947153    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:00:52.956120    3673 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:00:52.964110    3673 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:00:52.972137    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:53.078103    3673 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:00:53.097675    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:00:53.097756    3673 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:00:53.112446    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:00:53.127406    3673 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:00:53.144131    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:00:53.155196    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:00:53.165334    3673 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:00:53.191499    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:00:53.201932    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:00:53.215879    3673 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:00:53.218826    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:00:53.226091    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:00:53.239772    3673 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:00:53.345187    3673 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:00:53.464204    3673 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:00:53.464287    3673 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:00:53.478304    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:53.574350    3673 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:00:55.900933    3673 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.32656477s)
	I0731 10:00:55.900999    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:00:55.912322    3673 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:00:55.925824    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:00:55.936801    3673 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:00:56.031696    3673 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:00:56.126413    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.241279    3673 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:00:56.258174    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:00:56.269382    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.365935    3673 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:00:56.430017    3673 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:00:56.430103    3673 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:00:56.435941    3673 start.go:563] Will wait 60s for crictl version
	I0731 10:00:56.435987    3673 ssh_runner.go:195] Run: which crictl
	I0731 10:00:56.439039    3673 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:00:56.464351    3673 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:00:56.464431    3673 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:00:56.482582    3673 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:00:56.520920    3673 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:00:56.520980    3673 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:00:56.521425    3673 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:00:56.525996    3673 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
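	The `/etc/hosts` update just above uses a strip-then-append pattern: filter out any existing line for the managed hostname, re-append the desired entry, and copy the temp file back, so the file always holds exactly one canonical mapping no matter how many times it runs. A minimal sketch with a hypothetical helper name, operating on an ordinary file instead of sudo-copying over `/etc/hosts`:

```shell
# Ensure a hosts-style file maps NAME exactly once: drop any existing
# entry for NAME, append "IP<TAB>NAME", then replace the file atomically.
# Mirrors the log's /etc/hosts rewrite; helper and paths are stand-ins.
set_hosts_entry() {
  file="$1"; ip="$2"; name="$3"
  tmp="$file.$$"
  { grep -v "[[:space:]]$name\$" "$file" 2>/dev/null
    printf '%s\t%s\n' "$ip" "$name"; } > "$tmp"
  mv "$tmp" "$file"
}
```

The `grep -v` pass is what makes repeated runs safe: re-running with a new IP replaces the old entry rather than accumulating duplicates.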
	I0731 10:00:56.535672    3673 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 10:00:56.535754    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:56.535821    3673 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:00:56.552650    3673 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:00:56.552662    3673 docker.go:615] Images already preloaded, skipping extraction
	I0731 10:00:56.552737    3673 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:00:56.566645    3673 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:00:56.566662    3673 cache_images.go:84] Images are preloaded, skipping loading
	I0731 10:00:56.566671    3673 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 10:00:56.566751    3673 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:00:56.566818    3673 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 10:00:56.602960    3673 cni.go:84] Creating CNI manager for ""
	I0731 10:00:56.602973    3673 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:00:56.602987    3673 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 10:00:56.603002    3673 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 10:00:56.603093    3673 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%!"(MISSING)
	  nodefs.inodesFree: "0%!"(MISSING)
	  imagefs.available: "0%!"(MISSING)
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 10:00:56.603113    3673 kube-vip.go:115] generating kube-vip config ...
	I0731 10:00:56.603160    3673 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:00:56.615328    3673 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:00:56.615398    3673 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:00:56.615448    3673 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:00:56.624876    3673 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:00:56.624925    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 10:00:56.632950    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 10:00:56.646343    3673 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:00:56.659992    3673 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 10:00:56.674079    3673 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:00:56.687863    3673 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:00:56.690766    3673 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:00:56.700932    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:00:56.802080    3673 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:00:56.817267    3673 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 10:00:56.817279    3673 certs.go:194] generating shared ca certs ...
	I0731 10:00:56.817290    3673 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.817479    3673 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:00:56.817554    3673 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:00:56.817564    3673 certs.go:256] generating profile certs ...
	I0731 10:00:56.817680    3673 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:00:56.817703    3673 certs.go:363] generating signed profile cert for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e
	I0731 10:00:56.817718    3673 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.169.0.5 192.169.0.6 192.169.0.7 192.169.0.254]
	I0731 10:00:56.884314    3673 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e ...
	I0731 10:00:56.884330    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e: {Name:mk4c6f4a11277f3afefbfb19687b3bc0d7252c4a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.884782    3673 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e ...
	I0731 10:00:56.884793    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e: {Name:mk5943238cbce29d53e24742be6a5a17eba24882 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:56.885016    3673 certs.go:381] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt.b87e2e7e -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt
	I0731 10:00:56.885227    3673 certs.go:385] copying /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e -> /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key
	I0731 10:00:56.885483    3673 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:00:56.885493    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:00:56.885517    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:00:56.885538    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:00:56.885559    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:00:56.885578    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:00:56.885598    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:00:56.885616    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:00:56.885638    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:00:56.885737    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:00:56.885783    3673 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:00:56.885791    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:00:56.885821    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:00:56.885850    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:00:56.885884    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:00:56.885950    3673 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:00:56.885985    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:00:56.886006    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:56.886025    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:00:56.886457    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:00:56.908157    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:00:56.929396    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:00:56.960758    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:00:56.990308    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:00:57.032527    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:00:57.067819    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:00:57.106993    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:00:57.128481    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:00:57.148005    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:00:57.167023    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:00:57.187176    3673 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 10:00:57.200671    3673 ssh_runner.go:195] Run: openssl version
	I0731 10:00:57.204930    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:00:57.213254    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.216716    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.216751    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:00:57.221199    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:00:57.229492    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:00:57.237806    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.241238    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.241272    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:00:57.245487    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:00:57.253849    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:00:57.262176    3673 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.265564    3673 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.265596    3673 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:00:57.269893    3673 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:00:57.278252    3673 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:00:57.281742    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:00:57.286355    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:00:57.290726    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:00:57.295185    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:00:57.299486    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:00:57.303679    3673 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:00:57.308178    3673 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 C
lusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false fres
hpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:do
cker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:00:57.308287    3673 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 10:00:57.320531    3673 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 10:00:57.328175    3673 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0731 10:00:57.328185    3673 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0731 10:00:57.328220    3673 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0731 10:00:57.336122    3673 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:00:57.336458    3673 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-393000" does not appear in /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.336540    3673 kubeconfig.go:62] /Users/jenkins/minikube-integration/19349-1046/kubeconfig needs updating (will repair): [kubeconfig missing "ha-393000" cluster setting kubeconfig missing "ha-393000" context setting]
	I0731 10:00:57.336737    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.337225    3673 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.337442    3673 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)},
UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x4704660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 10:00:57.337765    3673 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 10:00:57.337950    3673 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0731 10:00:57.345178    3673 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0731 10:00:57.345192    3673 kubeadm.go:597] duration metric: took 17.003153ms to restartPrimaryControlPlane
	I0731 10:00:57.345197    3673 kubeadm.go:394] duration metric: took 37.024806ms to StartCluster
	I0731 10:00:57.345206    3673 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.345280    3673 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:00:57.345652    3673 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:00:57.345873    3673 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:00:57.345886    3673 start.go:241] waiting for startup goroutines ...
	I0731 10:00:57.345893    3673 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 10:00:57.346021    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:57.387797    3673 out.go:177] * Enabled addons: 
	I0731 10:00:57.409811    3673 addons.go:510] duration metric: took 63.912845ms for enable addons: enabled=[]
	I0731 10:00:57.409921    3673 start.go:246] waiting for cluster config update ...
	I0731 10:00:57.409935    3673 start.go:255] writing updated cluster config ...
	I0731 10:00:57.431766    3673 out.go:177] 
	I0731 10:00:57.453235    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:00:57.453375    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.475712    3673 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 10:00:57.517781    3673 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:00:57.517815    3673 cache.go:56] Caching tarball of preloaded images
	I0731 10:00:57.517989    3673 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:00:57.518009    3673 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:00:57.518145    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.519033    3673 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:00:57.519139    3673 start.go:364] duration metric: took 81.076µs to acquireMachinesLock for "ha-393000-m02"
	I0731 10:00:57.519165    3673 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:00:57.519174    3673 fix.go:54] fixHost starting: m02
	I0731 10:00:57.519609    3673 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:00:57.519636    3673 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:00:57.528542    3673 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51833
	I0731 10:00:57.528874    3673 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:00:57.529218    3673 main.go:141] libmachine: Using API Version  1
	I0731 10:00:57.529229    3673 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:00:57.529483    3673 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:00:57.529615    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:00:57.529706    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:00:57.529814    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.529894    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3205
	I0731 10:00:57.530866    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3205 missing from process table
	I0731 10:00:57.530885    3673 fix.go:112] recreateIfNeeded on ha-393000-m02: state=Stopped err=<nil>
	I0731 10:00:57.530896    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	W0731 10:00:57.530989    3673 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:00:57.572756    3673 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m02" ...
	I0731 10:00:57.593944    3673 main.go:141] libmachine: (ha-393000-m02) Calling .Start
	I0731 10:00:57.594326    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.594377    3673 main.go:141] libmachine: (ha-393000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 10:00:57.596218    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3205 missing from process table
	I0731 10:00:57.596230    3673 main.go:141] libmachine: (ha-393000-m02) DBG | pid 3205 is in state "Stopped"
	I0731 10:00:57.596246    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid...
	I0731 10:00:57.596604    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 10:00:57.624044    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 10:00:57.624071    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:00:57.624255    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000385aa0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:57.624298    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000385aa0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:00:57.624343    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machine
s/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:00:57.624392    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=t
tyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:00:57.624400    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:00:57.625747    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 DEBUG: hyperkit: Pid is 3703
	I0731 10:00:57.626235    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 10:00:57.626251    3673 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:00:57.626342    3673 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:00:57.627854    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 10:00:57.627953    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:00:57.627977    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abbf3e}
	I0731 10:00:57.628011    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:00:57.628034    3673 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbebe}
	I0731 10:00:57.628052    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 10:00:57.628055    3673 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 10:00:57.628122    3673 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 10:00:57.628751    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:00:57.628985    3673 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:00:57.629386    3673 machine.go:94] provisionDockerMachine start ...
	I0731 10:00:57.629397    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:00:57.629529    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:00:57.629660    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:00:57.629776    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:00:57.629863    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:00:57.629968    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:00:57.630086    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:00:57.630251    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:00:57.630259    3673 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:00:57.633394    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:00:57.642233    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:00:57.643220    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:57.643237    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:57.643246    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:57.643254    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:57 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:58.024328    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:00:58.024355    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:00:58.138946    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:00:58.138964    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:00:58.138972    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:00:58.138978    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:00:58.139795    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:00:58.139805    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:00:58 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:01:03.704070    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:01:03.704154    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:01:03.704165    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:01:03.727851    3673 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:01:03 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:01:08.691307    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:01:08.691320    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.691494    3673 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 10:01:08.691506    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.691592    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.691677    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:08.691764    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.691853    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.691954    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:08.692085    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:08.692236    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:08.692245    3673 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 10:01:08.755413    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 10:01:08.755429    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.755569    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:08.755667    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.755765    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:08.755854    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:08.755980    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:08.756132    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:08.756143    3673 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:01:08.818688    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:01:08.818702    3673 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:01:08.818713    3673 buildroot.go:174] setting up certificates
	I0731 10:01:08.818719    3673 provision.go:84] configureAuth start
	I0731 10:01:08.818725    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:01:08.818852    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:01:08.818933    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:08.819011    3673 provision.go:143] copyHostCerts
	I0731 10:01:08.819041    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:01:08.819091    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:01:08.819096    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:01:08.819226    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:01:08.819432    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:01:08.819463    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:01:08.819467    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:01:08.819545    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:01:08.819683    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:01:08.819712    3673 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:01:08.819716    3673 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:01:08.819792    3673 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:01:08.819938    3673 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 10:01:09.050116    3673 provision.go:177] copyRemoteCerts
	I0731 10:01:09.050171    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:01:09.050188    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.050328    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.050426    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.050517    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.050597    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:09.085881    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:01:09.085963    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:01:09.105721    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:01:09.105784    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:01:09.125488    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:01:09.125555    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:01:09.145164    3673 provision.go:87] duration metric: took 326.438057ms to configureAuth
	I0731 10:01:09.145176    3673 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:01:09.145335    3673 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:01:09.145348    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:09.145480    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.145573    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.145655    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.145735    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.145811    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.145938    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.146068    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.146076    3673 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:01:09.201832    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:01:09.201843    3673 buildroot.go:70] root file system type: tmpfs
	I0731 10:01:09.201923    3673 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:01:09.201934    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.202081    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.202179    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.202271    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.202354    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.202487    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.202618    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.202666    3673 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:01:09.267323    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:01:09.267343    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:09.267478    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:09.267567    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.267645    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:09.267729    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:09.267847    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:09.267982    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:09.267994    3673 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:01:10.914498    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:01:10.914513    3673 machine.go:97] duration metric: took 13.285120143s to provisionDockerMachine
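	The `diff … || { mv …; systemctl … }` command a few lines up is minikube's update-if-changed idiom: the unit file is replaced and Docker restarted only when the freshly rendered unit differs from what is installed. Here `diff` also fails because the old unit file does not exist yet (`can't stat`), which is why the replacement branch runs and the symlink gets created. A minimal sketch of the idiom with throwaway temp files (paths and contents are illustrative, not the real unit files):

```shell
# Update-if-changed: diff exits non-zero when the files differ (or the
# destination is missing), so the replacement branch runs only then.
old=$(mktemp) && rm -f "$old"        # simulate "no existing unit file"
new=$(mktemp)
printf 'ExecStart=/usr/bin/dockerd\n' > "$new"
diff -u "$old" "$new" >/dev/null 2>&1 || { mv "$new" "$old"; echo "unit updated"; }
cat "$old"
```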
	I0731 10:01:10.914520    3673 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 10:01:10.914527    3673 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:01:10.914537    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:10.914733    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:01:10.914747    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:10.914855    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:10.914953    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:10.915048    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:10.915144    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:10.959674    3673 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:01:10.963100    3673 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:01:10.963114    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:01:10.963203    3673 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:01:10.963349    3673 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:01:10.963357    3673 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:01:10.963530    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:01:10.972659    3673 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:01:11.000647    3673 start.go:296] duration metric: took 86.118358ms for postStartSetup
	I0731 10:01:11.000670    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.000870    3673 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:01:11.000882    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.001001    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.001098    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.001173    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.001251    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:11.035137    3673 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:01:11.035197    3673 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:01:11.088856    3673 fix.go:56] duration metric: took 13.569681658s for fixHost
	I0731 10:01:11.088895    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.089041    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.089136    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.089222    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.089315    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.089453    3673 main.go:141] libmachine: Using SSH client type: native
	I0731 10:01:11.089599    3673 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x325f0c0] 0x3261e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:01:11.089606    3673 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:01:11.145954    3673 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445271.149143027
	
	I0731 10:01:11.145966    3673 fix.go:216] guest clock: 1722445271.149143027
	I0731 10:01:11.145974    3673 fix.go:229] Guest: 2024-07-31 10:01:11.149143027 -0700 PDT Remote: 2024-07-31 10:01:11.088876 -0700 PDT m=+34.629889069 (delta=60.267027ms)
	I0731 10:01:11.145984    3673 fix.go:200] guest clock delta is within tolerance: 60.267027ms
	I0731 10:01:11.145988    3673 start.go:83] releasing machines lock for "ha-393000-m02", held for 13.626840447s
	I0731 10:01:11.146004    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.146144    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:01:11.168821    3673 out.go:177] * Found network options:
	I0731 10:01:11.189411    3673 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 10:01:11.210445    3673 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:01:11.210484    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211359    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211621    3673 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:01:11.211769    3673 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:01:11.211807    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 10:01:11.211854    3673 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:01:11.211952    3673 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:01:11.211972    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:01:11.212003    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.212195    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:01:11.212236    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.212390    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:01:11.212455    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.212607    3673 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:01:11.212628    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:01:11.212741    3673 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 10:01:11.245224    3673 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:01:11.245288    3673 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:01:11.292468    3673 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:01:11.292485    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:01:11.292564    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:01:11.308790    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:01:11.317853    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:01:11.326752    3673 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:01:11.326791    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:01:11.335723    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:01:11.344565    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:01:11.353617    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:01:11.362526    3673 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:01:11.371536    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:01:11.380589    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:01:11.389630    3673 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:01:11.398848    3673 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:01:11.407046    3673 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:01:11.415065    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:01:11.507632    3673 ssh_runner.go:195] Run: sudo systemctl restart containerd
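	The run of `sed` commands above rewrites `/etc/containerd/config.toml` on the guest so containerd uses the cgroupfs driver before the daemon is restarted. The pivotal edit is the `SystemdCgroup` substitution; a sketch of it against a throwaway copy of the config (file contents are illustrative):

```shell
cfg=$(mktemp)
printf '            SystemdCgroup = true\n' > "$cfg"
# Same substitution the log runs remotely: flip SystemdCgroup to false
# while preserving the line's leading indentation via the capture group.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
cat "$cfg"
```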
	I0731 10:01:11.526508    3673 start.go:495] detecting cgroup driver to use...
	I0731 10:01:11.526575    3673 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:01:11.541590    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:01:11.552707    3673 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:01:11.574170    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:01:11.585642    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:01:11.595961    3673 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:01:11.615167    3673 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:01:11.625493    3673 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:01:11.640509    3673 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:01:11.643540    3673 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:01:11.650600    3673 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:01:11.664458    3673 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:01:11.766555    3673 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:01:11.880513    3673 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:01:11.880542    3673 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:01:11.894469    3673 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:01:11.987172    3673 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:02:12.930966    3673 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1m0.943784713s)
	I0731 10:02:12.931036    3673 ssh_runner.go:195] Run: sudo journalctl --no-pager -u docker
	I0731 10:02:12.964792    3673 out.go:177] 
	W0731 10:02:12.986436    3673 out.go:239] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart docker: Process exited with status 1
	stdout:
	
	stderr:
	Job for docker.service failed because the control process exited with error code.
	See "systemctl status docker.service" and "journalctl -xeu docker.service" for details.
	
	sudo journalctl --no-pager -u docker:
	-- stdout --
	Jul 31 17:01:09 ha-393000-m02 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.540836219Z" level=info msg="Starting up"
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.541317477Z" level=info msg="containerd not running, starting managed containerd"
	Jul 31 17:01:09 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:09.541838265Z" level=info msg="started new containerd process" address=/var/run/docker/containerd/containerd.sock module=libcontainerd pid=494
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.560371937Z" level=info msg="starting containerd" revision=8fc6bcff51318944179630522a095cc9dbf9f353 version=v1.7.20
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576586336Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576670079Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576715322Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576725763Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576901546Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.576942171Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577133168Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577170137Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577183696Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577195352Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577298762Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.577522478Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579179447Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.10.207\\n\"): skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579249243Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579392843Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/docker/containerd/daemon/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579426223Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579535672Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.579581480Z" level=info msg="metadata content store policy set" policy=shared
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581466765Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581512332Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581524910Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581534838Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581543733Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581622493Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581841090Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581949001Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581963875Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581973012Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.581991066Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582002817Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582011239Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582020290Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582029399Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582037767Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582045966Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582053831Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582071064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582081124Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582091080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582100077Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582108106Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582116349Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582123631Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582131784Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582141489Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582150904Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582158314Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582166201Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582174064Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582185286Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582198762Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582207204Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582214973Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582263286Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582297170Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582306849Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582315043Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582321631Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582330079Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582340119Z" level=info msg="NRI interface is disabled by configuration."
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582481302Z" level=info msg=serving... address=/var/run/docker/containerd/containerd-debug.sock
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582557809Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock.ttrpc
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582588544Z" level=info msg=serving... address=/var/run/docker/containerd/containerd.sock
	Jul 31 17:01:09 ha-393000-m02 dockerd[494]: time="2024-07-31T17:01:09.582634793Z" level=info msg="containerd successfully booted in 0.023010s"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.561555310Z" level=info msg="[graphdriver] trying configured driver: overlay2"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.591279604Z" level=info msg="Loading containers: start."
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.773936432Z" level=warning msg="ip6tables is enabled, but cannot set up ip6tables chains" error="failed to create NAT chain DOCKER: iptables failed: ip6tables --wait -t nat -N DOCKER: ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)\nPerhaps ip6tables or your kernel needs to be upgraded.\n (exit status 3)"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.836555927Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.880452097Z" level=info msg="Loading containers: done."
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.887082310Z" level=info msg="Docker daemon" commit=cc13f95 containerd-snapshotter=false storage-driver=overlay2 version=27.1.1
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.887241928Z" level=info msg="Daemon has completed initialization"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.912027531Z" level=info msg="API listen on /var/run/docker.sock"
	Jul 31 17:01:10 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:10.912107549Z" level=info msg="API listen on [::]:2376"
	Jul 31 17:01:10 ha-393000-m02 systemd[1]: Started Docker Application Container Engine.
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.014698698Z" level=info msg="Processing signal 'terminated'"
	Jul 31 17:01:12 ha-393000-m02 systemd[1]: Stopping Docker Application Container Engine...
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.015851363Z" level=info msg="stopping event stream following graceful shutdown" error="<nil>" module=libcontainerd namespace=moby
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016218102Z" level=info msg="Daemon shutdown complete"
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016264206Z" level=info msg="stopping event stream following graceful shutdown" error="context canceled" module=libcontainerd namespace=plugins.moby
	Jul 31 17:01:12 ha-393000-m02 dockerd[487]: time="2024-07-31T17:01:12.016277600Z" level=info msg="stopping healthcheck following graceful shutdown" module=libcontainerd
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: docker.service: Deactivated successfully.
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: Stopped Docker Application Container Engine.
	Jul 31 17:01:13 ha-393000-m02 systemd[1]: Starting Docker Application Container Engine...
	Jul 31 17:01:13 ha-393000-m02 dockerd[1165]: time="2024-07-31T17:01:13.051074379Z" level=info msg="Starting up"
	Jul 31 17:02:13 ha-393000-m02 dockerd[1165]: failed to start daemon: failed to dial "/run/containerd/containerd.sock": failed to dial "/run/containerd/containerd.sock": context deadline exceeded
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: docker.service: Main process exited, code=exited, status=1/FAILURE
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: docker.service: Failed with result 'exit-code'.
	Jul 31 17:02:13 ha-393000-m02 systemd[1]: Failed to start Docker Application Container Engine.
	
	-- /stdout --
	W0731 10:02:12.986535    3673 out.go:239] * 
	W0731 10:02:12.987705    3673 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:02:13.049576    3673 out.go:177] 
	
	
	==> Docker <==
	Jul 31 17:01:03 ha-393000 dockerd[1182]: time="2024-07-31T17:01:03.955592926Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:25 ha-393000 dockerd[1176]: time="2024-07-31T17:01:25.077687038Z" level=info msg="ignoring event" container=8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:01:25 ha-393000 dockerd[1182]: time="2024-07-31T17:01:25.078615114Z" level=info msg="shim disconnected" id=8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7 namespace=moby
	Jul 31 17:01:25 ha-393000 dockerd[1182]: time="2024-07-31T17:01:25.079358383Z" level=warning msg="cleaning up after shim disconnected" id=8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7 namespace=moby
	Jul 31 17:01:25 ha-393000 dockerd[1182]: time="2024-07-31T17:01:25.079401742Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:01:26 ha-393000 dockerd[1176]: time="2024-07-31T17:01:26.090101886Z" level=info msg="ignoring event" container=375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:01:26 ha-393000 dockerd[1182]: time="2024-07-31T17:01:26.090726839Z" level=info msg="shim disconnected" id=375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3 namespace=moby
	Jul 31 17:01:26 ha-393000 dockerd[1182]: time="2024-07-31T17:01:26.090779643Z" level=warning msg="cleaning up after shim disconnected" id=375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3 namespace=moby
	Jul 31 17:01:26 ha-393000 dockerd[1182]: time="2024-07-31T17:01:26.090788476Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149678222Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149721539Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149733538Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:43 ha-393000 dockerd[1182]: time="2024-07-31T17:01:43.149849844Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.145857339Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.146175734Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.146320206Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:01:51 ha-393000 dockerd[1182]: time="2024-07-31T17:01:51.146525965Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:02:03 ha-393000 dockerd[1176]: time="2024-07-31T17:02:03.637026559Z" level=info msg="ignoring event" container=e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:02:03 ha-393000 dockerd[1182]: time="2024-07-31T17:02:03.636863270Z" level=info msg="shim disconnected" id=e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb namespace=moby
	Jul 31 17:02:03 ha-393000 dockerd[1182]: time="2024-07-31T17:02:03.637470771Z" level=warning msg="cleaning up after shim disconnected" id=e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb namespace=moby
	Jul 31 17:02:03 ha-393000 dockerd[1182]: time="2024-07-31T17:02:03.637576887Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1182]: time="2024-07-31T17:02:11.770900358Z" level=info msg="shim disconnected" id=dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222 namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1182]: time="2024-07-31T17:02:11.770966496Z" level=warning msg="cleaning up after shim disconnected" id=dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222 namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1182]: time="2024-07-31T17:02:11.770975588Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:02:11 ha-393000 dockerd[1176]: time="2024-07-31T17:02:11.771381326Z" level=info msg="ignoring event" container=dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	dd8064769032d       76932a3b37d7e                                                                                         28 seconds ago       Exited              kube-controller-manager   2                   626ea84aade06       kube-controller-manager-ha-393000
	e557dfd18a90c       1f6d574d502f3                                                                                         36 seconds ago       Exited              kube-apiserver            2                   194073f1c5ac9       kube-apiserver-ha-393000
	86018b08bbaa1       3861cfcd7c04c                                                                                         About a minute ago   Running             etcd                      1                   ba75e4f4299bf       etcd-ha-393000
	5fcb6f7d8ab78       38af8ddebf499                                                                                         About a minute ago   Running             kube-vip                  0                   e6198932cc027       kube-vip-ha-393000
	d088fefe5f8e3       3edc18e7b7672                                                                                         About a minute ago   Running             kube-scheduler            1                   f04a7ecd568d2       kube-scheduler-ha-393000
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   5 minutes ago        Exited              busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         7 minutes ago        Exited              coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         7 minutes ago        Exited              coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	6d966e37d3618       6e38f40d628db                                                                                         7 minutes ago        Exited              storage-provisioner       0                   25b3d6db405f4       storage-provisioner
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              8 minutes ago        Exited              kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         8 minutes ago        Exited              kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	e68314e525ef8       ghcr.io/kube-vip/kube-vip@sha256:360f0c5d02322075cc80edb9e4e0d2171e941e55072184f1f902203fafc81d0f     8 minutes ago        Exited              kube-vip                  0                   c9f21d49b1384       kube-vip-ha-393000
	63e56744c84ee       3861cfcd7c04c                                                                                         8 minutes ago        Exited              etcd                      0                   f8f20b1290499       etcd-ha-393000
	65412448c586b       3edc18e7b7672                                                                                         8 minutes ago        Exited              kube-scheduler            0                   7ab9affa89eca       kube-scheduler-ha-393000
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [feda36fb8a03] <==
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	command /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig" failed with error: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.3/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig": Process exited with status 1
	stdout:
	
	stderr:
	E0731 17:02:20.121393    2951 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:20.121695    2951 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:20.123208    2951 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:20.123387    2951 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	E0731 17:02:20.124658    2951 memcache.go:265] couldn't get current server API group list: Get "https://localhost:8443/api?timeout=32s": dial tcp 127.0.0.1:8443: connect: connection refused
	The connection to the server localhost:8443 was refused - did you specify the right host or port?
	
	
	==> dmesg <==
	[  +0.000000] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000001] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.035932] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.007966] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[  +5.670742] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.007034] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.684430] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.291972] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000007] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +2.476130] systemd-fstab-generator[479]: Ignoring "noauto" option for root device
	[  +0.098443] systemd-fstab-generator[491]: Ignoring "noauto" option for root device
	[  +1.350473] kauditd_printk_skb: 42 callbacks suppressed
	[  +0.720403] systemd-fstab-generator[1105]: Ignoring "noauto" option for root device
	[  +0.253450] systemd-fstab-generator[1142]: Ignoring "noauto" option for root device
	[  +0.122356] systemd-fstab-generator[1154]: Ignoring "noauto" option for root device
	[  +0.120102] systemd-fstab-generator[1168]: Ignoring "noauto" option for root device
	[  +2.456184] systemd-fstab-generator[1382]: Ignoring "noauto" option for root device
	[  +0.101772] systemd-fstab-generator[1394]: Ignoring "noauto" option for root device
	[  +0.097595] systemd-fstab-generator[1406]: Ignoring "noauto" option for root device
	[  +0.132878] systemd-fstab-generator[1422]: Ignoring "noauto" option for root device
	[  +0.441269] systemd-fstab-generator[1586]: Ignoring "noauto" option for root device
	[Jul31 17:01] kauditd_printk_skb: 271 callbacks suppressed
	[ +21.458149] kauditd_printk_skb: 40 callbacks suppressed
	
	
	==> etcd [63e56744c84e] <==
	{"level":"info","ts":"2024-07-31T17:00:28.797806Z","caller":"traceutil/trace.go:171","msg":"trace[2004834865] range","detail":"{range_begin:/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath; range_end:; }","duration":"7.769094578s","start":"2024-07-31T17:00:21.028708Z","end":"2024-07-31T17:00:28.797803Z","steps":["trace[2004834865] 'agreement among raft nodes before linearized reading'  (duration: 7.769083488s)"],"step_count":1}
	{"level":"warn","ts":"2024-07-31T17:00:28.797814Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-07-31T17:00:21.028699Z","time spent":"7.769113485s","remote":"127.0.0.1:48488","response type":"/etcdserverpb.KV/Range","request count":0,"request size":67,"response count":0,"response size":0,"request content":"key:\"/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath\" "}
	2024/07/31 17:00:28 WARNING: [core] [Server #5] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-07-31T17:00:28.797862Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-07-31T17:00:27.051463Z","time spent":"1.746397856s","remote":"127.0.0.1:48588","response type":"/etcdserverpb.KV/Txn","request count":0,"request size":0,"response count":0,"response size":0,"request content":""}
	2024/07/31 17:00:28 WARNING: [core] [Server #5] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"warn","ts":"2024-07-31T17:00:28.892734Z","caller":"embed/serve.go:212","msg":"stopping secure grpc server due to error","error":"accept tcp 192.169.0.5:2379: use of closed network connection"}
	{"level":"warn","ts":"2024-07-31T17:00:28.892779Z","caller":"embed/serve.go:214","msg":"stopped secure grpc server due to error","error":"accept tcp 192.169.0.5:2379: use of closed network connection"}
	{"level":"info","ts":"2024-07-31T17:00:28.892813Z","caller":"etcdserver/server.go:1462","msg":"skipped leadership transfer; local server is not leader","local-member-id":"b8c6c7563d17d844","current-leader-member-id":"0"}
	{"level":"info","ts":"2024-07-31T17:00:28.893872Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.89389Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.893909Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894003Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894029Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894052Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.89406Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"1c40d7bfcdf14e3b"}
	{"level":"info","ts":"2024-07-31T17:00:28.894064Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894069Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894081Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894256Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.89428Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.894301Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.89431Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:00:28.900461Z","caller":"embed/etcd.go:579","msg":"stopping serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-07-31T17:00:28.900567Z","caller":"embed/etcd.go:584","msg":"stopped serving peer traffic","address":"192.169.0.5:2380"}
	{"level":"info","ts":"2024-07-31T17:00:28.900576Z","caller":"embed/etcd.go:377","msg":"closed etcd server","name":"ha-393000","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.169.0.5:2380"],"advertise-client-urls":["https://192.169.0.5:2379"]}
	
	
	==> etcd [86018b08bbaa] <==
	{"level":"info","ts":"2024-07-31T17:02:14.507168Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507183Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507199Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:14.507209Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:02:15.5398Z","caller":"etcdhttp/health.go:232","msg":"serving /health false; no leader"}
	{"level":"warn","ts":"2024-07-31T17:02:15.539893Z","caller":"etcdhttp/health.go:119","msg":"/health error","output":"{\"health\":\"false\",\"reason\":\"RAFT NO LEADER\"}","status-code":503}
	{"level":"info","ts":"2024-07-31T17:02:16.306371Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:16.306422Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:16.306447Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:16.306457Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:16.306463Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:18.106756Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:18.106804Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:18.10682Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:18.106838Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:18.106848Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:02:19.365027Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"warn","ts":"2024-07-31T17:02:19.365086Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:02:19.365097Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"warn","ts":"2024-07-31T17:02:19.365132Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"info","ts":"2024-07-31T17:02:19.907139Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:19.907223Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:19.907243Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:19.90726Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:02:19.907285Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	
	
	==> kernel <==
	 17:02:20 up 1 min,  0 users,  load average: 0.24, 0.10, 0.04
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:59:40.110698       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:59:50.118349       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:59:50.118427       1 main.go:299] handling current node
	I0731 16:59:50.118450       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:59:50.118464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:59:50.118651       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:59:50.118739       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.118883       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:00.118987       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:00.119126       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:00.119236       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.119356       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:00.119483       1 main.go:299] handling current node
	I0731 17:00:10.110002       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:10.111054       1 main.go:299] handling current node
	I0731 17:00:10.111286       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:10.111319       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:10.111445       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:10.111480       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:20.116250       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:20.116442       1 main.go:299] handling current node
	I0731 17:00:20.116458       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:20.116464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:20.116608       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:20.116672       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [e557dfd18a90] <==
	I0731 17:01:43.256034       1 options.go:221] external host was not specified, using 192.169.0.5
	I0731 17:01:43.256765       1 server.go:148] Version: v1.30.3
	I0731 17:01:43.256805       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:01:43.612164       1 shared_informer.go:313] Waiting for caches to sync for node_authorizer
	I0731 17:01:43.614459       1 shared_informer.go:313] Waiting for caches to sync for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:01:43.616925       1 plugins.go:157] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0731 17:01:43.616979       1 plugins.go:160] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0731 17:01:43.617139       1 instance.go:299] Using reconciler: lease
	W0731 17:02:03.611949       1 logging.go:59] [core] [Channel #2 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0731 17:02:03.612150       1 logging.go:59] [core] [Channel #1 SubChannel #3] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	F0731 17:02:03.618964       1 instance.go:292] Error creating leases: error creating storage factory: context deadline exceeded
	
	
	==> kube-controller-manager [dd8064769032] <==
	I0731 17:01:51.487714       1 serving.go:380] Generated self-signed cert in-memory
	I0731 17:01:51.747070       1 controllermanager.go:189] "Starting" version="v1.30.3"
	I0731 17:01:51.747213       1 controllermanager.go:191] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:01:51.750433       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0731 17:01:51.750809       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:01:51.750880       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0731 17:01:51.750924       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	E0731 17:02:11.753015       1 controllermanager.go:234] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.169.0.5:8443/healthz\": dial tcp 192.169.0.5:8443: connect: connection refused"
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [65412448c586] <==
	E0731 16:53:48.491132       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0731 16:53:48.491335       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:48.491387       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0731 16:53:48.491507       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0731 16:53:48.491594       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0731 16:53:48.491662       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:48.491738       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:48.491818       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:48.491860       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:48.491537       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:48.491873       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.319781       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0731 16:53:49.319838       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0731 16:53:49.326442       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.326478       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.392116       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0731 16:53:49.392172       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0731 16:53:49.496014       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0731 16:53:49.496036       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0731 16:53:49.541411       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0731 16:53:49.541927       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0731 16:53:49.588695       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0731 16:53:49.588735       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	I0731 16:53:49.982415       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0731 17:00:28.842533       1 run.go:74] "command failed" err="finished without leader elect"
	
	
	==> kube-scheduler [d088fefe5f8e] <==
	E0731 17:02:04.628988       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55278->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:04.628727       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55294->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.629137       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55294->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:04.629274       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55260->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.629350       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55260->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:04.629024       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolume: Get "https://192.169.0.5:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55272->192.169.0.5:8443: read: connection reset by peer
	E0731 17:02:04.629603       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://192.169.0.5:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused - error from a previous attempt: read tcp 192.169.0.5:55272->192.169.0.5:8443: read: connection reset by peer
	W0731 17:02:05.172498       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:05.172740       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:14.697016       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:14.697068       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:15.773865       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Namespace: Get "https://192.169.0.5:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:15.773914       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://192.169.0.5:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:16.600223       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:16.600272       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:17.841327       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:17.841376       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:18.163867       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicaSet: Get "https://192.169.0.5:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:18.163910       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.169.0.5:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:19.053316       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://192.169.0.5:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:19.053378       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://192.169.0.5:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:19.062180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:19.062230       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:02:19.143206       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: Get "https://192.169.0.5:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:02:19.143254       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.169.0.5:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	
	
	==> kubelet <==
	Jul 31 17:02:03 ha-393000 kubelet[1593]: I0731 17:02:03.874945    1593 scope.go:117] "RemoveContainer" containerID="8bb07cb63f7ae5228a6d7adb86e3681edb56b4fd4b15ab7962709af8bddb2fc7"
	Jul 31 17:02:03 ha-393000 kubelet[1593]: I0731 17:02:03.875815    1593 scope.go:117] "RemoveContainer" containerID="e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb"
	Jul 31 17:02:03 ha-393000 kubelet[1593]: E0731 17:02:03.876093    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver pod=kube-apiserver-ha-393000_kube-system(97d9112f0375227c852af24f4082cc7e)\"" pod="kube-system/kube-apiserver-ha-393000" podUID="97d9112f0375227c852af24f4082cc7e"
	Jul 31 17:02:04 ha-393000 kubelet[1593]: E0731 17:02:04.641093    1593 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://control-plane.minikube.internal:8443/api/v1/nodes\": dial tcp 192.169.0.254:8443: connect: no route to host" node="ha-393000"
	Jul 31 17:02:04 ha-393000 kubelet[1593]: E0731 17:02:04.641160    1593 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://control-plane.minikube.internal:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ha-393000?timeout=10s\": dial tcp 192.169.0.254:8443: connect: no route to host" interval="7s"
	Jul 31 17:02:07 ha-393000 kubelet[1593]: E0731 17:02:07.161123    1593 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ha-393000\" not found"
	Jul 31 17:02:07 ha-393000 kubelet[1593]: I0731 17:02:07.713966    1593 scope.go:117] "RemoveContainer" containerID="e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb"
	Jul 31 17:02:07 ha-393000 kubelet[1593]: E0731 17:02:07.714569    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver pod=kube-apiserver-ha-393000_kube-system(97d9112f0375227c852af24f4082cc7e)\"" pod="kube-system/kube-apiserver-ha-393000" podUID="97d9112f0375227c852af24f4082cc7e"
	Jul 31 17:02:10 ha-393000 kubelet[1593]: I0731 17:02:10.692480    1593 scope.go:117] "RemoveContainer" containerID="e557dfd18a90cdbb1bbe8105adc60e8a722b173c6c4bc1b23095a1ec366c37cb"
	Jul 31 17:02:10 ha-393000 kubelet[1593]: E0731 17:02:10.693256    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver pod=kube-apiserver-ha-393000_kube-system(97d9112f0375227c852af24f4082cc7e)\"" pod="kube-system/kube-apiserver-ha-393000" podUID="97d9112f0375227c852af24f4082cc7e"
	Jul 31 17:02:10 ha-393000 kubelet[1593]: E0731 17:02:10.785079    1593 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://control-plane.minikube.internal:8443/api/v1/namespaces/default/events\": dial tcp 192.169.0.254:8443: connect: no route to host" event="&Event{ObjectMeta:{ha-393000.17e75ad5dc50e2da  default    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ha-393000,UID:ha-393000,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ha-393000,},FirstTimestamp:2024-07-31 17:00:57.063326426 +0000 UTC m=+0.117315655,LastTimestamp:2024-07-31 17:00:57.063326426 +0000 UTC m=+0.117315655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ha-393000,}"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: I0731 17:02:11.644681    1593 kubelet_node_status.go:73] "Attempting to register node" node="ha-393000"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: I0731 17:02:11.957788    1593 scope.go:117] "RemoveContainer" containerID="375b8cf2627384241404ef9132b75b8fecf39eea39fc6b2973782dfbe70c02f3"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: I0731 17:02:11.958478    1593 scope.go:117] "RemoveContainer" containerID="dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222"
	Jul 31 17:02:11 ha-393000 kubelet[1593]: E0731 17:02:11.958720    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-ha-393000_kube-system(ae5c50a5b151d76ab8b2e88315db2b23)\"" pod="kube-system/kube-controller-manager-ha-393000" podUID="ae5c50a5b151d76ab8b2e88315db2b23"
	Jul 31 17:02:13 ha-393000 kubelet[1593]: W0731 17:02:13.855738    1593 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://control-plane.minikube.internal:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.254:8443: connect: no route to host
	Jul 31 17:02:13 ha-393000 kubelet[1593]: E0731 17:02:13.855784    1593 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://control-plane.minikube.internal:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.254:8443: connect: no route to host
	Jul 31 17:02:13 ha-393000 kubelet[1593]: E0731 17:02:13.855832    1593 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://control-plane.minikube.internal:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ha-393000?timeout=10s\": dial tcp 192.169.0.254:8443: connect: no route to host" interval="7s"
	Jul 31 17:02:13 ha-393000 kubelet[1593]: E0731 17:02:13.855916    1593 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://control-plane.minikube.internal:8443/api/v1/nodes\": dial tcp 192.169.0.254:8443: connect: no route to host" node="ha-393000"
	Jul 31 17:02:15 ha-393000 kubelet[1593]: I0731 17:02:15.731620    1593 scope.go:117] "RemoveContainer" containerID="dd8064769032dce361add051c69a68a47f5ee39ba0624824174797a77d2a1222"
	Jul 31 17:02:15 ha-393000 kubelet[1593]: E0731 17:02:15.731858    1593 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-ha-393000_kube-system(ae5c50a5b151d76ab8b2e88315db2b23)\"" pod="kube-system/kube-controller-manager-ha-393000" podUID="ae5c50a5b151d76ab8b2e88315db2b23"
	Jul 31 17:02:17 ha-393000 kubelet[1593]: E0731 17:02:17.168732    1593 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ha-393000\" not found"
	Jul 31 17:02:20 ha-393000 kubelet[1593]: W0731 17:02:20.000256    1593 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://control-plane.minikube.internal:8443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.169.0.254:8443: connect: no route to host
	Jul 31 17:02:20 ha-393000 kubelet[1593]: E0731 17:02:20.000316    1593 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://control-plane.minikube.internal:8443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.169.0.254:8443: connect: no route to host
	Jul 31 17:02:20 ha-393000 kubelet[1593]: I0731 17:02:20.860445    1593 kubelet_node_status.go:73] "Attempting to register node" node="ha-393000"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:254: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000: exit status 2 (148.613912ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:254: status error: exit status 2 (may be ok)
helpers_test.go:256: "ha-393000" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (2.80s)

                                                
                                    
TestMultiControlPlane/serial/StopCluster (160.95s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/StopCluster
ha_test.go:531: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 stop -v=7 --alsologtostderr
E0731 10:02:22.195155    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 10:03:27.205262    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 10:04:50.266087    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
ha_test.go:531: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 stop -v=7 --alsologtostderr: (2m40.779550904s)
ha_test.go:537: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:537: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 7 (105.565603ms)

                                                
                                                
-- stdout --
	ha-393000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-393000-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-393000-m03
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-393000-m04
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 10:05:02.196865    3818 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:05:02.197068    3818 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.197073    3818 out.go:304] Setting ErrFile to fd 2...
	I0731 10:05:02.197077    3818 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.197250    3818 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:05:02.197451    3818 out.go:298] Setting JSON to false
	I0731 10:05:02.197471    3818 mustload.go:65] Loading cluster: ha-393000
	I0731 10:05:02.197508    3818 notify.go:220] Checking for updates...
	I0731 10:05:02.197791    3818 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:02.197806    3818 status.go:255] checking status of ha-393000 ...
	I0731 10:05:02.198132    3818 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.198181    3818 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.207052    3818 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51973
	I0731 10:05:02.207390    3818 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.207841    3818 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.207855    3818 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.208119    3818 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.208260    3818 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:02.208351    3818 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.208419    3818 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:05:02.209343    3818 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 3685 missing from process table
	I0731 10:05:02.209393    3818 status.go:330] ha-393000 host status = "Stopped" (err=<nil>)
	I0731 10:05:02.209400    3818 status.go:343] host is not running, skipping remaining checks
	I0731 10:05:02.209408    3818 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:05:02.209463    3818 status.go:255] checking status of ha-393000-m02 ...
	I0731 10:05:02.209704    3818 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.209751    3818 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.218072    3818 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51975
	I0731 10:05:02.218409    3818 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.218749    3818 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.218770    3818 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.219015    3818 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.219127    3818 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:05:02.219234    3818 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.219312    3818 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:05:02.225671    3818 status.go:330] ha-393000-m02 host status = "Stopped" (err=<nil>)
	I0731 10:05:02.225673    3818 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:02.225682    3818 status.go:343] host is not running, skipping remaining checks
	I0731 10:05:02.225689    3818 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:05:02.225700    3818 status.go:255] checking status of ha-393000-m03 ...
	I0731 10:05:02.225972    3818 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.225997    3818 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.234416    3818 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51977
	I0731 10:05:02.234770    3818 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.235086    3818 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.235095    3818 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.235331    3818 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.235465    3818 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 10:05:02.235554    3818 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.235627    3818 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 10:05:02.236558    3818 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:05:02.236575    3818 status.go:330] ha-393000-m03 host status = "Stopped" (err=<nil>)
	I0731 10:05:02.236580    3818 status.go:343] host is not running, skipping remaining checks
	I0731 10:05:02.236587    3818 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:05:02.236607    3818 status.go:255] checking status of ha-393000-m04 ...
	I0731 10:05:02.236865    3818 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.236888    3818 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.245219    3818 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51979
	I0731 10:05:02.245522    3818 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.245806    3818 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.245814    3818 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.246026    3818 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.246138    3818 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 10:05:02.246214    3818 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.246291    3818 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 10:05:02.247205    3818 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid 3095 missing from process table
	I0731 10:05:02.247231    3818 status.go:330] ha-393000-m04 host status = "Stopped" (err=<nil>)
	I0731 10:05:02.247238    3818 status.go:343] host is not running, skipping remaining checks
	I0731 10:05:02.247246    3818 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:543: status says not two control-plane nodes are present: args "out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr": ha-393000
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-393000-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-393000-m03
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-393000-m04
type: Worker
host: Stopped
kubelet: Stopped

                                                
                                                
ha_test.go:549: status says not three kubelets are stopped: args "out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr": ha-393000
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-393000-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-393000-m03
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-393000-m04
type: Worker
host: Stopped
kubelet: Stopped

                                                
                                                
ha_test.go:552: status says not two apiservers are stopped: args "out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr": ha-393000
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-393000-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-393000-m03
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-393000-m04
type: Worker
host: Stopped
kubelet: Stopped

                                                
                                                
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000: exit status 7 (66.581401ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "ha-393000" host is not running, skipping log retrieval (state="Stopped")
--- FAIL: TestMultiControlPlane/serial/StopCluster (160.95s)

                                                
                                    
TestMultiControlPlane/serial/RestartCluster (375.27s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/RestartCluster
ha_test.go:560: (dbg) Run:  out/minikube-darwin-amd64 start -p ha-393000 --wait=true -v=7 --alsologtostderr --driver=hyperkit 
E0731 10:06:54.504057    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 10:08:27.206422    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
ha_test.go:560: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p ha-393000 --wait=true -v=7 --alsologtostderr --driver=hyperkit : exit status 80 (6m10.38856837s)

                                                
                                                
-- stdout --
	* [ha-393000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	* Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	* Restarting existing hyperkit VM for "ha-393000" ...
	* Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	* Enabled addons: 
	
	* Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	* Restarting existing hyperkit VM for "ha-393000-m02" ...
	* Found network options:
	  - NO_PROXY=192.169.0.5
	* Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	  - env NO_PROXY=192.169.0.5
	* Verifying Kubernetes components...
	
	* Starting "ha-393000-m03" control-plane node in "ha-393000" cluster
	* Restarting existing hyperkit VM for "ha-393000-m03" ...
	* Found network options:
	  - NO_PROXY=192.169.0.5,192.169.0.6
	* Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	  - env NO_PROXY=192.169.0.5
	  - env NO_PROXY=192.169.0.5,192.169.0.6
	* Verifying Kubernetes components...
	
	* Starting "ha-393000-m04" worker node in "ha-393000" cluster
	* Restarting existing hyperkit VM for "ha-393000-m04" ...
	* Found network options:
	  - NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	* Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	  - env NO_PROXY=192.169.0.5
	  - env NO_PROXY=192.169.0.5,192.169.0.6
	  - env NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	* Verifying Kubernetes components...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 10:05:02.368405    3827 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:05:02.368654    3827 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.368660    3827 out.go:304] Setting ErrFile to fd 2...
	I0731 10:05:02.368664    3827 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.368853    3827 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:05:02.370244    3827 out.go:298] Setting JSON to false
	I0731 10:05:02.392379    3827 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2072,"bootTime":1722443430,"procs":453,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:05:02.392490    3827 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:05:02.414739    3827 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 10:05:02.457388    3827 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:05:02.457417    3827 notify.go:220] Checking for updates...
	I0731 10:05:02.499271    3827 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:02.520330    3827 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:05:02.541352    3827 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:05:02.562183    3827 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:05:02.583467    3827 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:05:02.605150    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:02.605829    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.605892    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.615374    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51985
	I0731 10:05:02.615746    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.616162    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.616171    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.616434    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.616563    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.616815    3827 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:05:02.617053    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.617075    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.625506    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51987
	I0731 10:05:02.625873    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.626205    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.626218    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.626409    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.626526    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.655330    3827 out.go:177] * Using the hyperkit driver based on existing profile
	I0731 10:05:02.697472    3827 start.go:297] selected driver: hyperkit
	I0731 10:05:02.697517    3827 start.go:901] validating driver "hyperkit" against &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:02.697705    3827 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:05:02.697830    3827 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:05:02.698011    3827 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:05:02.707355    3827 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:05:02.711327    3827 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.711347    3827 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:05:02.714056    3827 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:05:02.714115    3827 cni.go:84] Creating CNI manager for ""
	I0731 10:05:02.714124    3827 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:05:02.714208    3827 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:02.714310    3827 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:05:02.756588    3827 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 10:05:02.778505    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:02.778576    3827 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:05:02.778606    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:05:02.778797    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:05:02.778816    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:05:02.779007    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:02.779936    3827 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:05:02.780056    3827 start.go:364] duration metric: took 96.562µs to acquireMachinesLock for "ha-393000"
	I0731 10:05:02.780090    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:05:02.780107    3827 fix.go:54] fixHost starting: 
	I0731 10:05:02.780518    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.780547    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.789537    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51989
	I0731 10:05:02.789941    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.790346    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.790360    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.790582    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.790683    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.790784    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:02.790882    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.790960    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:05:02.791917    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 3685 missing from process table
	I0731 10:05:02.791950    3827 fix.go:112] recreateIfNeeded on ha-393000: state=Stopped err=<nil>
	I0731 10:05:02.791969    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	W0731 10:05:02.792054    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:05:02.834448    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000" ...
	I0731 10:05:02.857592    3827 main.go:141] libmachine: (ha-393000) Calling .Start
	I0731 10:05:02.857865    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.857903    3827 main.go:141] libmachine: (ha-393000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 10:05:02.857999    3827 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 10:05:02.972788    3827 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 10:05:02.972822    3827 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:05:02.973002    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002e0840)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:02.973031    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002e0840)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:02.973095    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:05:02.973143    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:05:02.973162    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:05:02.974700    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Pid is 3840
	I0731 10:05:02.975089    3827 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 10:05:02.975104    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.975174    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:05:02.977183    3827 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 10:05:02.977235    3827 main.go:141] libmachine: (ha-393000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:05:02.977252    3827 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66aa6ebd}
	I0731 10:05:02.977264    3827 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 10:05:02.977271    3827 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 10:05:02.977358    3827 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 10:05:02.978043    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:02.978221    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:02.978639    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:05:02.978649    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.978783    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:02.978867    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:02.978959    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:02.979081    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:02.979169    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:02.979279    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:02.979484    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:02.979495    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:05:02.982358    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:05:03.035630    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:05:03.036351    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:03.036364    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:03.036371    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:03.036377    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:03.417037    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:05:03.417051    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:05:03.531673    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:03.531715    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:03.531732    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:03.531747    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:03.532606    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:05:03.532629    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:05:09.110387    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:05:09.110442    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:05:09.110451    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:05:09.135557    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:05:12.964386    3827 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.5:22: connect: connection refused
	I0731 10:05:16.034604    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:05:16.034620    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.034750    3827 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 10:05:16.034759    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.034882    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.034984    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.035084    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.035183    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.035281    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.035421    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.035570    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.035579    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 10:05:16.113215    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 10:05:16.113236    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.113381    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.113518    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.113636    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.113755    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.113885    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.114075    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.114086    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:05:16.184090    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:05:16.184121    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:05:16.184150    3827 buildroot.go:174] setting up certificates
	I0731 10:05:16.184163    3827 provision.go:84] configureAuth start
	I0731 10:05:16.184170    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.184309    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:16.184430    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.184520    3827 provision.go:143] copyHostCerts
	I0731 10:05:16.184558    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:16.184631    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:05:16.184638    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:16.184770    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:05:16.184969    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:16.185016    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:05:16.185020    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:16.185099    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:05:16.185248    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:16.185290    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:05:16.185295    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:16.185376    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:05:16.185533    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 10:05:16.315363    3827 provision.go:177] copyRemoteCerts
	I0731 10:05:16.315421    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:05:16.315435    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.315558    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.315655    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.315746    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.315837    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:16.355172    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:05:16.355248    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:05:16.374013    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:05:16.374082    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0731 10:05:16.392556    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:05:16.392614    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:05:16.411702    3827 provision.go:87] duration metric: took 227.524882ms to configureAuth
	I0731 10:05:16.411715    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:05:16.411879    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:16.411893    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:16.412059    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.412155    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.412231    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.412316    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.412388    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.412496    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.412621    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.412628    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:05:16.477022    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:05:16.477033    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:05:16.477102    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:05:16.477118    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.477251    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.477356    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.477432    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.477517    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.477641    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.477778    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.477823    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:05:16.554633    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:05:16.554652    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.554788    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.554883    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.554976    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.555060    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.555183    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.555333    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.555346    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:05:18.220571    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:05:18.220585    3827 machine.go:97] duration metric: took 15.241941013s to provisionDockerMachine
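The provisioning step that just completed writes `docker.service.new` and only swaps it in when it differs from the installed unit (`diff -u … || { mv …; systemctl … }` above). That diff-or-replace idempotent update can be sketched as a standalone script; the temp file, unit body, and `echo` stand-ins for `systemctl` are illustrative, not minikube's actual paths:

```shell
set -eu
unit=$(mktemp)        # stands in for /lib/systemd/system/docker.service
new="${unit}.new"

# Write the candidate unit (toy content) next to the installed one.
printf '%s\n' '[Unit]' 'Description=Example Engine' > "$new"

# Only install the candidate and restart when it differs from (or is
# missing at) the installed path; diff's non-zero exit triggers the swap.
if ! diff -u "$unit" "$new" >/dev/null 2>&1; then
    mv "$new" "$unit"
    echo 'daemon-reload'      # stand-in for: sudo systemctl daemon-reload
    echo 'restart docker'     # stand-in for: sudo systemctl -f restart docker
else
    echo 'unit unchanged'
fi
```

Re-running the script with identical candidate content takes the `unit unchanged` branch, which is why the provisioner can run this step on every start. Note that in the log the first `diff` fails with "can't stat" because the unit did not exist yet, which also triggers the replace branch.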
	I0731 10:05:18.220598    3827 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 10:05:18.220606    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:05:18.220616    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.220842    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:05:18.220863    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.220962    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.221049    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.221130    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.221229    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.266644    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:05:18.270380    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:05:18.270395    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:05:18.270494    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:05:18.270687    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:05:18.270693    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:05:18.270912    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:05:18.279363    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:18.313374    3827 start.go:296] duration metric: took 92.765768ms for postStartSetup
	I0731 10:05:18.313403    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.313592    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:05:18.313611    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.313704    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.313791    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.313881    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.313968    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.352727    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:05:18.352783    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:05:18.406781    3827 fix.go:56] duration metric: took 15.626681307s for fixHost
	I0731 10:05:18.406809    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.406951    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.407051    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.407152    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.407242    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.407364    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:18.407503    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:18.407510    3827 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 10:05:18.475125    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445518.591979627
	
	I0731 10:05:18.475138    3827 fix.go:216] guest clock: 1722445518.591979627
	I0731 10:05:18.475144    3827 fix.go:229] Guest: 2024-07-31 10:05:18.591979627 -0700 PDT Remote: 2024-07-31 10:05:18.406799 -0700 PDT m=+16.073052664 (delta=185.180627ms)
	I0731 10:05:18.475163    3827 fix.go:200] guest clock delta is within tolerance: 185.180627ms
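The guest-clock check above runs `date +%s.%N` over SSH and compares it with the host clock, accepting the start only when the skew is small (here 185ms). A minimal local sketch of that tolerance check follows; both timestamps are taken on the same machine and the 2-second tolerance is an illustrative value, not minikube's actual constant:

```shell
set -eu
guest=$(date +%s)   # in the log this value comes from `date +%s.%N` over SSH
host=$(date +%s)

# Absolute skew between the two clocks, in whole seconds.
delta=$((host - guest))
if [ "$delta" -lt 0 ]; then delta=$((0 - delta)); fi

if [ "$delta" -le 2 ]; then
    echo "guest clock delta within tolerance: ${delta}s"
else
    echo "guest clock delta too large: ${delta}s" >&2
fi
```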
	I0731 10:05:18.475167    3827 start.go:83] releasing machines lock for "ha-393000", held for 15.69510158s
	I0731 10:05:18.475186    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.475358    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:18.475493    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.475894    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.476002    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.476070    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:05:18.476101    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.476134    3827 ssh_runner.go:195] Run: cat /version.json
	I0731 10:05:18.476146    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.476186    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.476210    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.476297    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.476335    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.476385    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.476425    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.476484    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.476507    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.560719    3827 ssh_runner.go:195] Run: systemctl --version
	I0731 10:05:18.565831    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 10:05:18.570081    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:05:18.570125    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:05:18.582480    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
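The `find … -exec mv {} {}.mk_disabled` step above disables competing bridge/podman CNI configs by renaming them so the runtime ignores them. A self-contained sketch of that rename, with a temp directory standing in for `/etc/cni/net.d`:

```shell
set -eu
netd=$(mktemp -d)   # stands in for /etc/cni/net.d
touch "$netd/87-podman-bridge.conflist" "$netd/10-kindnet.conflist"

# Rename bridge/podman configs that are not already disabled; other
# CNI configs (here the kindnet one) are left in place.
find "$netd" -maxdepth 1 -type f \
  \( \( -name '*bridge*' -or -name '*podman*' \) -and -not -name '*.mk_disabled' \) \
  -exec sh -c 'mv "$1" "$1.mk_disabled"' _ {} \;

ls "$netd"
```

The `-not -name '*.mk_disabled'` guard is what makes the step safe to repeat: already-disabled files do not match again.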
	I0731 10:05:18.582493    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:18.582597    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:18.598651    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:05:18.607729    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:05:18.616451    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:05:18.616493    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:05:18.625351    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:18.634238    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:05:18.643004    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:18.651930    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:05:18.660791    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:05:18.669545    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:05:18.678319    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:05:18.687162    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:05:18.695297    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:05:18.703279    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:18.796523    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
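The run of `sed` commands above rewrites `/etc/containerd/config.toml` in place to select the cgroupfs driver. The key substitution (forcing `SystemdCgroup = false` while preserving indentation) can be sketched against a toy config file; the content below is a stand-in, not the real containerd config:

```shell
set -eu
cfg=$(mktemp)   # stands in for /etc/containerd/config.toml
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF

# Same substitution the provisioner runs (GNU sed): capture the leading
# whitespace and rewrite the value, so indentation survives the edit.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"

grep 'SystemdCgroup' "$cfg"
```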
	I0731 10:05:18.814363    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:18.814439    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:05:18.827366    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:18.839312    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:05:18.855005    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:18.866218    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:18.877621    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:05:18.902460    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:18.913828    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:18.928675    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:05:18.931574    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:05:18.939501    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:05:18.952896    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:05:19.047239    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:05:19.144409    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:05:19.144484    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:05:19.159518    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:19.256187    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:05:21.607075    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.350869373s)
	I0731 10:05:21.607140    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:05:21.618076    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:05:21.632059    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:21.642878    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:05:21.739846    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:05:21.840486    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:21.956403    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:05:21.971397    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:21.982152    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:22.074600    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:05:22.139737    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:05:22.139811    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:05:22.144307    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:05:22.144354    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:05:22.147388    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:05:22.177098    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:05:22.177167    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:22.195025    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:22.255648    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:05:22.255698    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:22.256066    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:05:22.260342    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
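The `/etc/hosts` update above is another idempotent pattern: filter out any stale `host.minikube.internal` line with `grep -v`, append the desired mapping, and copy the result back. A sketch with a temp file standing in for `/etc/hosts` (the stale `192.169.0.9` entry is invented to show the replacement; the logged command anchors the match on a literal tab, simplified here):

```shell
set -eu
hosts=$(mktemp)   # stands in for /etc/hosts
printf '127.0.0.1\tlocalhost\n192.169.0.9\thost.minikube.internal\n' > "$hosts"

# Drop any existing mapping, then append the one we want, so repeated
# runs always converge on exactly one host.minikube.internal line.
{ grep -v 'host.minikube.internal$' "$hosts"
  printf '192.169.0.1\thost.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"

cat "$hosts"
```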
	I0731 10:05:22.270020    3827 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 10:05:22.270145    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:22.270198    3827 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:05:22.283427    3827 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:05:22.283451    3827 docker.go:615] Images already preloaded, skipping extraction
	I0731 10:05:22.283523    3827 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:05:22.296364    3827 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:05:22.296384    3827 cache_images.go:84] Images are preloaded, skipping loading
	I0731 10:05:22.296395    3827 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 10:05:22.296485    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:05:22.296554    3827 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 10:05:22.333611    3827 cni.go:84] Creating CNI manager for ""
	I0731 10:05:22.333625    3827 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:05:22.333642    3827 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 10:05:22.333657    3827 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 10:05:22.333735    3827 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 10:05:22.333754    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:05:22.333805    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:05:22.346453    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:05:22.346520    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:05:22.346575    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:05:22.354547    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:05:22.354585    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 10:05:22.361938    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 10:05:22.375252    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:05:22.388755    3827 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 10:05:22.402335    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:05:22.415747    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:05:22.418701    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:22.428772    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:22.517473    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:22.532209    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 10:05:22.532222    3827 certs.go:194] generating shared ca certs ...
	I0731 10:05:22.532233    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:22.532416    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:05:22.532495    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:05:22.532505    3827 certs.go:256] generating profile certs ...
	I0731 10:05:22.532617    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:05:22.532703    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e
	I0731 10:05:22.532784    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:05:22.532791    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:05:22.532813    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:05:22.532832    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:05:22.532850    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:05:22.532866    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:05:22.532896    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:05:22.532925    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:05:22.532949    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:05:22.533054    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:05:22.533101    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:05:22.533110    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:05:22.533142    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:05:22.533177    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:05:22.533206    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:05:22.533274    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:22.533306    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.533327    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.533344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.533765    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:05:22.562933    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:05:22.585645    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:05:22.608214    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:05:22.634417    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:05:22.664309    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:05:22.693214    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:05:22.749172    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:05:22.798119    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:05:22.837848    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:05:22.862351    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:05:22.887141    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 10:05:22.900789    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:05:22.904988    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:05:22.914154    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.917542    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.917577    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.921712    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:05:22.930986    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:05:22.940208    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.943536    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.943573    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.947845    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:05:22.957024    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:05:22.965988    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.969319    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.969351    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.973794    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:05:22.982944    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:05:22.986290    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:05:22.990544    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:05:22.994707    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:05:22.999035    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:05:23.003364    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:05:23.007486    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:05:23.011657    3827 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:23.011769    3827 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 10:05:23.024287    3827 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 10:05:23.032627    3827 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0731 10:05:23.032639    3827 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0731 10:05:23.032681    3827 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0731 10:05:23.040731    3827 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:05:23.041056    3827 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-393000" does not appear in /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.041141    3827 kubeconfig.go:62] /Users/jenkins/minikube-integration/19349-1046/kubeconfig needs updating (will repair): [kubeconfig missing "ha-393000" cluster setting kubeconfig missing "ha-393000" context setting]
	I0731 10:05:23.041332    3827 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.041968    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.042168    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 10:05:23.042482    3827 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 10:05:23.042638    3827 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0731 10:05:23.050561    3827 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0731 10:05:23.050575    3827 kubeadm.go:597] duration metric: took 17.931942ms to restartPrimaryControlPlane
	I0731 10:05:23.050580    3827 kubeadm.go:394] duration metric: took 38.928464ms to StartCluster
	I0731 10:05:23.050588    3827 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.050661    3827 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.051035    3827 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.051268    3827 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:05:23.051280    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:05:23.051290    3827 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 10:05:23.051393    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:23.095938    3827 out.go:177] * Enabled addons: 
	I0731 10:05:23.116914    3827 addons.go:510] duration metric: took 65.60253ms for enable addons: enabled=[]
	I0731 10:05:23.116954    3827 start.go:246] waiting for cluster config update ...
	I0731 10:05:23.116965    3827 start.go:255] writing updated cluster config ...
	I0731 10:05:23.138605    3827 out.go:177] 
	I0731 10:05:23.160466    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:23.160597    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.182983    3827 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 10:05:23.224869    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:23.224904    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:05:23.225104    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:05:23.225125    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:05:23.225250    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.226256    3827 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:05:23.226360    3827 start.go:364] duration metric: took 80.549µs to acquireMachinesLock for "ha-393000-m02"
	I0731 10:05:23.226385    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:05:23.226394    3827 fix.go:54] fixHost starting: m02
	I0731 10:05:23.226804    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:23.226838    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:23.236394    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52012
	I0731 10:05:23.236756    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:23.237106    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:23.237125    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:23.237342    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:23.237473    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:23.237574    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:05:23.237669    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.237738    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:05:23.238671    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:23.238732    3827 fix.go:112] recreateIfNeeded on ha-393000-m02: state=Stopped err=<nil>
	I0731 10:05:23.238750    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	W0731 10:05:23.238834    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:05:23.260015    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m02" ...
	I0731 10:05:23.302032    3827 main.go:141] libmachine: (ha-393000-m02) Calling .Start
	I0731 10:05:23.302368    3827 main.go:141] libmachine: (ha-393000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 10:05:23.302393    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.304220    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:23.304235    3827 main.go:141] libmachine: (ha-393000-m02) DBG | pid 3703 is in state "Stopped"
	I0731 10:05:23.304257    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid...
	I0731 10:05:23.304590    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 10:05:23.331752    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 10:05:23.331774    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:05:23.331901    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c2fc0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:23.331928    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c2fc0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:23.331992    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:05:23.332030    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:05:23.332051    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:05:23.333566    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Pid is 3849
	I0731 10:05:23.333951    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 10:05:23.333966    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.334032    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3849
	I0731 10:05:23.335680    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 10:05:23.335745    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:05:23.335779    3827 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:05:23.335790    3827 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbf52}
	I0731 10:05:23.335796    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 10:05:23.335803    3827 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 10:05:23.335842    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 10:05:23.336526    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:23.336703    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.337199    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:05:23.337210    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:23.337324    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:23.337431    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:23.337536    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:23.337643    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:23.337761    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:23.337898    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:23.338051    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:23.338058    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:05:23.341501    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:05:23.350236    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:05:23.351301    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:23.351321    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:23.351333    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:23.351364    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:23.736116    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:05:23.736132    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:05:23.851173    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:23.851191    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:23.851204    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:23.851217    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:23.852083    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:05:23.852399    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:05:29.408102    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:05:29.408171    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:05:29.408180    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:05:29.431671    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:05:34.400446    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:05:34.400461    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.400584    3827 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 10:05:34.400595    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.400705    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.400796    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.400890    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.400963    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.401039    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.401181    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.401327    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.401336    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 10:05:34.470038    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 10:05:34.470053    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.470199    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.470327    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.470407    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.470489    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.470615    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.470762    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.470773    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:05:34.535872    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:05:34.535890    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:05:34.535899    3827 buildroot.go:174] setting up certificates
	I0731 10:05:34.535905    3827 provision.go:84] configureAuth start
	I0731 10:05:34.535911    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.536042    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:34.536141    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.536239    3827 provision.go:143] copyHostCerts
	I0731 10:05:34.536274    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:34.536323    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:05:34.536328    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:34.536441    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:05:34.536669    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:34.536701    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:05:34.536706    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:34.536812    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:05:34.536958    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:34.536987    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:05:34.536992    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:34.537061    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:05:34.537222    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 10:05:34.648982    3827 provision.go:177] copyRemoteCerts
	I0731 10:05:34.649040    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:05:34.649057    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.649198    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.649295    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.649402    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.649489    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:34.683701    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:05:34.683772    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:05:34.703525    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:05:34.703596    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:05:34.722548    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:05:34.722624    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:05:34.742309    3827 provision.go:87] duration metric: took 206.391288ms to configureAuth
	I0731 10:05:34.742322    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:05:34.742483    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:34.742496    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:34.742630    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.742723    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.742814    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.742903    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.742982    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.743099    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.743260    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.743269    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:05:34.800092    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:05:34.800106    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:05:34.800191    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:05:34.800203    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.800330    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.800415    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.800506    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.800591    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.800702    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.800838    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.800885    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:05:34.869190    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:05:34.869210    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.869342    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.869439    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.869544    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.869626    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.869780    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.869920    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.869935    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:05:36.520454    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:05:36.520469    3827 machine.go:97] duration metric: took 13.183263325s to provisionDockerMachine
	I0731 10:05:36.520479    3827 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 10:05:36.520499    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:05:36.520508    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.520691    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:05:36.520702    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.520789    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.520884    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.520979    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.521066    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.561300    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:05:36.564926    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:05:36.564938    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:05:36.565027    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:05:36.565170    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:05:36.565176    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:05:36.565342    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:05:36.574123    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:36.603284    3827 start.go:296] duration metric: took 82.788869ms for postStartSetup
	I0731 10:05:36.603307    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.603494    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:05:36.603509    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.603613    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.603706    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.603803    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.603903    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.639240    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:05:36.639297    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:05:36.692559    3827 fix.go:56] duration metric: took 13.466165097s for fixHost
	I0731 10:05:36.692585    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.692728    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.692817    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.692901    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.692991    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.693111    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:36.693255    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:36.693263    3827 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 10:05:36.752606    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445536.868457526
	
	I0731 10:05:36.752619    3827 fix.go:216] guest clock: 1722445536.868457526
	I0731 10:05:36.752626    3827 fix.go:229] Guest: 2024-07-31 10:05:36.868457526 -0700 PDT Remote: 2024-07-31 10:05:36.692574 -0700 PDT m=+34.358830009 (delta=175.883526ms)
	I0731 10:05:36.752636    3827 fix.go:200] guest clock delta is within tolerance: 175.883526ms
	I0731 10:05:36.752640    3827 start.go:83] releasing machines lock for "ha-393000-m02", held for 13.526270601s
	I0731 10:05:36.752657    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.752793    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:36.777379    3827 out.go:177] * Found network options:
	I0731 10:05:36.798039    3827 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 10:05:36.819503    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:05:36.819540    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820385    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820643    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820770    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:05:36.820818    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 10:05:36.820878    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:05:36.820996    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:05:36.821009    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.821024    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.821247    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.821250    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.821474    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.821525    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.821664    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.821739    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.821918    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 10:05:36.854335    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:05:36.854406    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:05:36.901302    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:05:36.901324    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:36.901422    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:36.917770    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:05:36.926621    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:05:36.935218    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:05:36.935259    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:05:36.943879    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:36.952873    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:05:36.961710    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:36.970281    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:05:36.979176    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:05:36.987922    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:05:36.996548    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:05:37.005349    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:05:37.013281    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:05:37.020977    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:37.118458    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:05:37.137862    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:37.137937    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:05:37.153588    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:37.167668    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:05:37.181903    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:37.192106    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:37.202268    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:05:37.223314    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:37.233629    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:37.248658    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:05:37.251547    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:05:37.258758    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:05:37.272146    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:05:37.371218    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:05:37.472623    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:05:37.472648    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:05:37.486639    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:37.587113    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:05:39.947283    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.360151257s)
	I0731 10:05:39.947347    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:05:39.958391    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:05:39.972060    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:39.983040    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:05:40.085475    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:05:40.202062    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:40.302654    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:05:40.316209    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:40.326252    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:40.418074    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:05:40.482758    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:05:40.482836    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:05:40.487561    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:05:40.487613    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:05:40.491035    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:05:40.518347    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:05:40.518420    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:40.537051    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:40.576384    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:05:40.597853    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:05:40.618716    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:40.618993    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:05:40.622501    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:40.631917    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:05:40.632085    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:40.632302    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:40.632324    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:40.640887    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52034
	I0731 10:05:40.641227    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:40.641546    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:40.641557    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:40.641784    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:40.641900    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:40.641993    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:40.642069    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:05:40.643035    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:05:40.643318    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:40.643340    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:40.651868    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52036
	I0731 10:05:40.652209    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:40.652562    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:40.652581    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:40.652781    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:40.652890    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:40.652982    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.6
	I0731 10:05:40.652988    3827 certs.go:194] generating shared ca certs ...
	I0731 10:05:40.653003    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:40.653135    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:05:40.653190    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:05:40.653199    3827 certs.go:256] generating profile certs ...
	I0731 10:05:40.653301    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:05:40.653388    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.59c17652
	I0731 10:05:40.653436    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:05:40.653443    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:05:40.653468    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:05:40.653489    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:05:40.653510    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:05:40.653529    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:05:40.653548    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:05:40.653566    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:05:40.653584    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:05:40.653667    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:05:40.653713    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:05:40.653722    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:05:40.653755    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:05:40.653790    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:05:40.653819    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:05:40.653897    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:40.653931    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:05:40.653957    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:40.653976    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:05:40.654001    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:40.654103    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:40.654205    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:40.654295    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:40.654382    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:40.686134    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0731 10:05:40.689771    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 10:05:40.697866    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0731 10:05:40.700957    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 10:05:40.708798    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 10:05:40.711973    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 10:05:40.719794    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0731 10:05:40.722937    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 10:05:40.731558    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0731 10:05:40.734708    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 10:05:40.742535    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0731 10:05:40.745692    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 10:05:40.753969    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:05:40.774721    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:05:40.793621    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:05:40.813481    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:05:40.833191    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:05:40.853099    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:05:40.872942    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:05:40.892952    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:05:40.912690    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:05:40.932438    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:05:40.952459    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:05:40.971059    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 10:05:40.984708    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 10:05:40.998235    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 10:05:41.011745    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 10:05:41.025144    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 10:05:41.038794    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 10:05:41.052449    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 10:05:41.066415    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:05:41.070679    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:05:41.078894    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.082206    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.082237    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.086362    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:05:41.094634    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:05:41.103040    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.106511    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.106559    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.110939    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:05:41.119202    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:05:41.127421    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.130734    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.130783    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.134845    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:05:41.142958    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:05:41.146291    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:05:41.150662    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:05:41.154843    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:05:41.159061    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:05:41.163240    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:05:41.167541    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:05:41.171729    3827 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.30.3 docker true true} ...
	I0731 10:05:41.171784    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:05:41.171806    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:05:41.171838    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:05:41.184093    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:05:41.184125    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:05:41.184181    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:05:41.191780    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:05:41.191825    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 10:05:41.199155    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:05:41.212419    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:05:41.225964    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:05:41.239859    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:05:41.242661    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:41.251855    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:41.345266    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:41.360525    3827 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:05:41.360751    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:41.382214    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:05:41.402932    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:41.525126    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:41.539502    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:41.539699    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:05:41.539742    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:05:41.539934    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m02" to be "Ready" ...
	I0731 10:05:41.540009    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:41.540015    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:41.540022    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:41.540026    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.017427    3827 round_trippers.go:574] Response Status: 200 OK in 8477 milliseconds
	I0731 10:05:50.018648    3827 node_ready.go:49] node "ha-393000-m02" has status "Ready":"True"
	I0731 10:05:50.018662    3827 node_ready.go:38] duration metric: took 8.478709659s for node "ha-393000-m02" to be "Ready" ...
	I0731 10:05:50.018668    3827 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:05:50.018717    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:05:50.018723    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.018731    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.018737    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.028704    3827 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0731 10:05:50.043501    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.043562    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:05:50.043568    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.043574    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.043579    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.049258    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.050015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.050025    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.050031    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.050035    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.066794    3827 round_trippers.go:574] Response Status: 200 OK in 16 milliseconds
	I0731 10:05:50.067093    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.067103    3827 pod_ready.go:81] duration metric: took 23.584491ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.067110    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.067150    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 10:05:50.067155    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.067161    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.067170    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.072229    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.072653    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.072662    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.072674    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.072678    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.076158    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:50.076475    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.076487    3827 pod_ready.go:81] duration metric: took 9.372147ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.076494    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.076536    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 10:05:50.076541    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.076547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.076551    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.079467    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.079849    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.079858    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.079866    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.079871    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.086323    3827 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 10:05:50.086764    3827 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.086775    3827 pod_ready.go:81] duration metric: took 10.276448ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.086782    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.086839    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 10:05:50.086846    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.086852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.086861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.090747    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:50.091293    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:50.091301    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.091306    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.091310    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.093538    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.094155    3827 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.094165    3827 pod_ready.go:81] duration metric: took 7.376399ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.094171    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.094209    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 10:05:50.094214    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.094220    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.094223    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.096892    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.219826    3827 request.go:629] Waited for 122.388601ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:05:50.219867    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:05:50.219876    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.219882    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.219887    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.222303    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.222701    3827 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.222710    3827 pod_ready.go:81] duration metric: took 128.533092ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.222720    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.419341    3827 request.go:629] Waited for 196.517978ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:05:50.419372    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:05:50.419376    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.419382    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.419386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.424561    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.619242    3827 request.go:629] Waited for 194.143472ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.619333    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.619339    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.619346    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.619350    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.622245    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.622550    3827 pod_ready.go:97] node "ha-393000" hosting pod "kube-apiserver-ha-393000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-393000" has status "Ready":"False"
	I0731 10:05:50.622563    3827 pod_ready.go:81] duration metric: took 399.836525ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	E0731 10:05:50.622570    3827 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-393000" hosting pod "kube-apiserver-ha-393000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-393000" has status "Ready":"False"
	I0731 10:05:50.622575    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.819353    3827 request.go:629] Waited for 196.739442ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:50.819427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:50.819433    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.819438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.819447    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.822809    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:51.019387    3827 request.go:629] Waited for 196.0195ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.019427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.019480    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.019488    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.019494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.021643    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.220184    3827 request.go:629] Waited for 96.247837ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.220254    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.220260    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.220266    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.220271    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.222468    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.419702    3827 request.go:629] Waited for 196.732028ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.419735    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.419739    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.419746    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.419749    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.422018    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.622851    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.622865    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.622870    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.622873    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.625570    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.818923    3827 request.go:629] Waited for 192.647007ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.818965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.818971    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.818977    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.818981    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.821253    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.123108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:52.123124    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.123133    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.123137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.125336    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.220188    3827 request.go:629] Waited for 94.282602ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.220295    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.220306    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.220317    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.220325    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.223136    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.623123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:52.623202    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.623217    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.623227    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.626259    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:52.626893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.626903    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.626912    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.626916    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.628416    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:52.628799    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:53.124413    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:53.124432    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.124441    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.124446    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.127045    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:53.127494    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:53.127501    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.127511    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.127514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.129223    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:53.623065    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:53.623121    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.623133    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.623142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.626047    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:53.626707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:53.626717    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.626725    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.626729    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.628447    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:54.123646    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:54.123761    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.123778    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.123788    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.127286    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:54.128015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:54.128025    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.128033    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.128038    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.130101    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:54.623229    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:54.623244    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.623253    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.623266    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.625325    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:54.625780    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:54.625788    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.625794    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.625798    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.627218    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:55.123298    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:55.123318    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.123329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.123334    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.126495    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:55.127199    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:55.127207    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.127213    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.127217    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.128585    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:55.128968    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:55.623994    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:55.624008    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.624016    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.624021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.626813    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:55.627329    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:55.627336    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.627342    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.627345    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.628805    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:56.123118    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:56.123195    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.123210    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.123231    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.126276    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:56.126864    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:56.126872    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.126877    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.126881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.128479    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:56.623814    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:56.623924    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.623942    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.623953    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.626841    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:56.627450    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:56.627457    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.627463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.627467    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.628844    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:57.124173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:57.124250    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.124262    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.124287    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.127734    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:57.128370    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:57.128377    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.128383    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.128386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.130108    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:57.130481    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:57.624004    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:57.624033    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.624093    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.624103    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.627095    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:57.628522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:57.628533    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.628541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.628547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.630446    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.123493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:58.123505    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.123512    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.123514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.125506    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.126108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:58.126116    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.126121    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.126124    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.127991    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.623114    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:58.623141    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.623216    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.623228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.626428    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:58.627173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:58.627181    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.627187    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.627191    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.628749    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.123212    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:59.123231    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.123243    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.123249    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.126584    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:59.127100    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:59.127110    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.127118    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.127123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.129080    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.624707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:59.624736    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.624808    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.624814    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.627710    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:59.628543    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:59.628550    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.628556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.628560    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.630077    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.630437    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:00.123863    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:00.123878    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.123885    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.123888    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.125761    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.126237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:00.126245    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.126251    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.126254    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.127937    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.623226    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:00.623240    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.623246    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.623249    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.625210    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.625691    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:00.625699    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.625704    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.625708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.627280    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:01.124705    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:01.124804    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.124820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.124830    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.127445    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:01.127933    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:01.127941    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.127947    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.127950    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.129462    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:01.623718    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:01.623731    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.623736    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.623739    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.625948    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:01.626336    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:01.626344    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.626349    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.626352    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.627901    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.124021    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:02.124081    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.124088    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.124092    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.125801    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.126187    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:02.126195    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.126200    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.126204    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.127656    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.127974    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:02.623206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:02.623222    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.623228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.623232    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.626774    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:02.627381    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:02.627389    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.627395    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.627400    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.630037    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:03.122889    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:03.122980    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.122991    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.122997    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.125539    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:03.125964    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:03.125972    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.125976    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.125991    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.129847    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:03.623340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:03.623368    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.623379    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.623386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.626892    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:03.627517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:03.627524    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.627530    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.627532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.629281    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.123967    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:04.124007    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.124016    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.124021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.126604    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.127104    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.127111    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.127116    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.127131    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.128806    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.129260    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.129268    3827 pod_ready.go:81] duration metric: took 13.506690115s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.129277    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.129312    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:04.129317    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.129323    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.129328    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.131506    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.131966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.131974    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.131980    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.131984    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.133464    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.133963    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.133974    3827 pod_ready.go:81] duration metric: took 4.690553ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.133981    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.134013    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:04.134018    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.134023    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.134028    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.136093    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.136498    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:04.136506    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.136512    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.136515    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.138480    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.138864    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.138874    3827 pod_ready.go:81] duration metric: took 4.887644ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.138882    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.138917    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:04.138922    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.138928    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.138932    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.140760    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.141121    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.141129    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.141134    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.141137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.143127    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.143455    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.143464    3827 pod_ready.go:81] duration metric: took 4.577275ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.143471    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.143508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:04.143513    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.143519    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.143523    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.145638    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.145987    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.145994    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.146000    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.146003    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.147718    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.148046    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.148055    3827 pod_ready.go:81] duration metric: took 4.578507ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.148061    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.325414    3827 request.go:629] Waited for 177.298505ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:04.325532    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:04.325544    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.325555    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.325563    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.328825    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:04.525753    3827 request.go:629] Waited for 196.338568ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.525806    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.525817    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.525828    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.525836    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.529114    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:04.529604    3827 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.529616    3827 pod_ready.go:81] duration metric: took 381.550005ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.529625    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.724886    3827 request.go:629] Waited for 195.165832ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:04.724925    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:04.724931    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.724937    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.724942    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.726934    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.924942    3827 request.go:629] Waited for 197.623557ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.924972    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.924977    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.924984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.924987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.927056    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.927556    3827 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.927566    3827 pod_ready.go:81] duration metric: took 397.934888ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.927572    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.124719    3827 request.go:629] Waited for 197.081968ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:05.124759    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:05.124767    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.124774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.124777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.126705    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:05.324036    3827 request.go:629] Waited for 196.854241ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.324127    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.324136    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.324144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.324151    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.326450    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:05.326831    3827 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:05.326840    3827 pod_ready.go:81] duration metric: took 399.263993ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.326854    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.525444    3827 request.go:629] Waited for 198.543186ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:05.525479    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:05.525484    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.525490    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.525494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.527459    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:05.724382    3827 request.go:629] Waited for 196.465154ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.724493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.724505    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.724516    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.724528    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.727650    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:05.728134    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:05.728147    3827 pod_ready.go:81] duration metric: took 401.285988ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.728155    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.925067    3827 request.go:629] Waited for 196.808438ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:05.925117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:05.925127    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.925137    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.925147    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.928198    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.125772    3827 request.go:629] Waited for 196.79397ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:06.125895    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:06.125907    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.125918    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.125924    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.129114    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.129535    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:06.129548    3827 pod_ready.go:81] duration metric: took 401.386083ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.129557    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.324601    3827 request.go:629] Waited for 194.995432ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:06.324707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:06.324718    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.324729    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.324736    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.327699    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:06.524056    3827 request.go:629] Waited for 195.918056ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:06.524164    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:06.524175    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.524186    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.524192    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.527800    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.528245    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:06.528255    3827 pod_ready.go:81] duration metric: took 398.692914ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.528262    3827 pod_ready.go:38] duration metric: took 16.509588377s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:06.528282    3827 api_server.go:52] waiting for apiserver process to appear ...
	I0731 10:06:06.528341    3827 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:06:06.541572    3827 api_server.go:72] duration metric: took 25.181024878s to wait for apiserver process to appear ...
	I0731 10:06:06.541584    3827 api_server.go:88] waiting for apiserver healthz status ...
	I0731 10:06:06.541605    3827 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 10:06:06.544968    3827 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 10:06:06.545011    3827 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 10:06:06.545016    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.545023    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.545027    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.545730    3827 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 10:06:06.545799    3827 api_server.go:141] control plane version: v1.30.3
	I0731 10:06:06.545808    3827 api_server.go:131] duration metric: took 4.219553ms to wait for apiserver health ...
	I0731 10:06:06.545813    3827 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 10:06:06.724899    3827 request.go:629] Waited for 179.053526ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:06.724936    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:06.724942    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.724948    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.724951    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.733411    3827 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0731 10:06:06.742910    3827 system_pods.go:59] 24 kube-system pods found
	I0731 10:06:06.742937    3827 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:06.742945    3827 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:06.742950    3827 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:06.742953    3827 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:06.742958    3827 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:06.742961    3827 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:06.742963    3827 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:06.742966    3827 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:06.742968    3827 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:06.742971    3827 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:06.742973    3827 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:06.742977    3827 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:06.742981    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:06.742984    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:06.742986    3827 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:06.742989    3827 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:06.742991    3827 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:06.742995    3827 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:06.742998    3827 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:06.743001    3827 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:06.743003    3827 system_pods.go:61] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Pending
	I0731 10:06:06.743006    3827 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:06.743010    3827 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:06.743012    3827 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:06.743017    3827 system_pods.go:74] duration metric: took 197.200154ms to wait for pod list to return data ...
	I0731 10:06:06.743023    3827 default_sa.go:34] waiting for default service account to be created ...
	I0731 10:06:06.925020    3827 request.go:629] Waited for 181.949734ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:06.925060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:06.925067    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.925076    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.925081    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.927535    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:06.927730    3827 default_sa.go:45] found service account: "default"
	I0731 10:06:06.927740    3827 default_sa.go:55] duration metric: took 184.712762ms for default service account to be created ...
	I0731 10:06:06.927745    3827 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 10:06:07.125051    3827 request.go:629] Waited for 197.272072ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:07.125082    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:07.125090    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:07.125100    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:07.125104    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:07.129975    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:07.134630    3827 system_pods.go:86] 24 kube-system pods found
	I0731 10:06:07.134648    3827 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:07.134654    3827 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:07.134659    3827 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:07.134663    3827 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:07.134666    3827 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:07.134671    3827 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running / Ready:ContainersNotReady (containers with unready status: [kindnet-cni]) / ContainersReady:ContainersNotReady (containers with unready status: [kindnet-cni])
	I0731 10:06:07.134675    3827 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:07.134679    3827 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:07.134683    3827 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:07.134705    3827 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:07.134712    3827 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:07.134718    3827 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0731 10:06:07.134723    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:07.134728    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:07.134731    3827 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:07.134735    3827 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:07.134739    3827 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running / Ready:ContainersNotReady (containers with unready status: [kube-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-proxy])
	I0731 10:06:07.134743    3827 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:07.134747    3827 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:07.134751    3827 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:07.134755    3827 system_pods.go:89] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:07.134764    3827 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:07.134768    3827 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:07.134772    3827 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0731 10:06:07.134781    3827 system_pods.go:126] duration metric: took 207.030567ms to wait for k8s-apps to be running ...
	I0731 10:06:07.134786    3827 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 10:06:07.134841    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:06:07.148198    3827 system_svc.go:56] duration metric: took 13.406485ms WaitForService to wait for kubelet
	I0731 10:06:07.148215    3827 kubeadm.go:582] duration metric: took 25.78766951s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:06:07.148230    3827 node_conditions.go:102] verifying NodePressure condition ...
	I0731 10:06:07.324197    3827 request.go:629] Waited for 175.905806ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:07.324227    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:07.324232    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:07.324238    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:07.324243    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:07.329946    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:07.330815    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330830    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330840    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330843    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330847    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330850    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330853    3827 node_conditions.go:105] duration metric: took 182.619551ms to run NodePressure ...
	I0731 10:06:07.330860    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:06:07.330878    3827 start.go:255] writing updated cluster config ...
	I0731 10:06:07.352309    3827 out.go:177] 
	I0731 10:06:07.373528    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:07.373631    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.433500    3827 out.go:177] * Starting "ha-393000-m03" control-plane node in "ha-393000" cluster
	I0731 10:06:07.475236    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:06:07.475262    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:06:07.475398    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:06:07.475412    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:06:07.475498    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.476024    3827 start.go:360] acquireMachinesLock for ha-393000-m03: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:06:07.476077    3827 start.go:364] duration metric: took 40.57µs to acquireMachinesLock for "ha-393000-m03"
	I0731 10:06:07.476090    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:06:07.476095    3827 fix.go:54] fixHost starting: m03
	I0731 10:06:07.476337    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:07.476357    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:07.485700    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52041
	I0731 10:06:07.486069    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:07.486427    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:07.486449    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:07.486677    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:07.486797    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:07.486888    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 10:06:07.486969    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.487057    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 10:06:07.488010    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:06:07.488031    3827 fix.go:112] recreateIfNeeded on ha-393000-m03: state=Stopped err=<nil>
	I0731 10:06:07.488039    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	W0731 10:06:07.488129    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:06:07.525270    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m03" ...
	I0731 10:06:07.583189    3827 main.go:141] libmachine: (ha-393000-m03) Calling .Start
	I0731 10:06:07.583357    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.583398    3827 main.go:141] libmachine: (ha-393000-m03) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid
	I0731 10:06:07.584444    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:06:07.584457    3827 main.go:141] libmachine: (ha-393000-m03) DBG | pid 2994 is in state "Stopped"
	I0731 10:06:07.584473    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid...
	I0731 10:06:07.584622    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Using UUID 451d6bef-97a7-42a6-8ccb-b8851dda0594
	I0731 10:06:07.614491    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Generated MAC 3e:56:a2:18:e2:4c
	I0731 10:06:07.614519    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:06:07.614662    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003b11a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:07.614709    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003b11a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:07.614792    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "451d6bef-97a7-42a6-8ccb-b8851dda0594", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:06:07.614841    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 451d6bef-97a7-42a6-8ccb-b8851dda0594 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:06:07.614865    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:06:07.616508    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Pid is 3858
	I0731 10:06:07.617000    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 0
	I0731 10:06:07.617017    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.617185    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 3858
	I0731 10:06:07.619558    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 10:06:07.619621    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:06:07.619647    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:06:07.619664    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:06:07.619685    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:06:07.619703    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 10:06:07.619712    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Found match: 3e:56:a2:18:e2:4c
	I0731 10:06:07.619727    3827 main.go:141] libmachine: (ha-393000-m03) DBG | IP: 192.169.0.7
	I0731 10:06:07.619755    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 10:06:07.620809    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:07.621055    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.621590    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:06:07.621602    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:07.621745    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:07.621861    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:07.621957    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:07.622061    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:07.622150    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:07.622290    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:07.622460    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:07.622469    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:06:07.625744    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:06:07.635188    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:06:07.636453    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:07.636476    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:07.636488    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:07.636503    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:08.026194    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:06:08.026210    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:06:08.141380    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:08.141403    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:08.141420    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:08.141430    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:08.142228    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:06:08.142237    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:06:13.717443    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:06:13.717596    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:06:13.717612    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:06:13.741129    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:06:18.682578    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:06:18.682599    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.682767    3827 buildroot.go:166] provisioning hostname "ha-393000-m03"
	I0731 10:06:18.682779    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.682866    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.682981    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.683070    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.683166    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.683267    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.683412    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.683571    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.683581    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m03 && echo "ha-393000-m03" | sudo tee /etc/hostname
	I0731 10:06:18.749045    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m03
	
	I0731 10:06:18.749064    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.749190    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.749278    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.749369    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.749454    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.749565    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.749706    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.749722    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:06:18.806865    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:06:18.806883    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:06:18.806892    3827 buildroot.go:174] setting up certificates
	I0731 10:06:18.806898    3827 provision.go:84] configureAuth start
	I0731 10:06:18.806904    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.807035    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:18.807129    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.807209    3827 provision.go:143] copyHostCerts
	I0731 10:06:18.807236    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:06:18.807287    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:06:18.807293    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:06:18.807440    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:06:18.807654    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:06:18.807687    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:06:18.807691    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:06:18.807798    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:06:18.807946    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:06:18.807978    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:06:18.807983    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:06:18.808051    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:06:18.808199    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m03 san=[127.0.0.1 192.169.0.7 ha-393000-m03 localhost minikube]
	I0731 10:06:18.849388    3827 provision.go:177] copyRemoteCerts
	I0731 10:06:18.849440    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:06:18.849454    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.849608    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.849706    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.849793    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.849878    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:18.882927    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:06:18.883001    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:06:18.902836    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:06:18.902904    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:06:18.922711    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:06:18.922778    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 10:06:18.943709    3827 provision.go:87] duration metric: took 136.803232ms to configureAuth
	I0731 10:06:18.943724    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:06:18.943896    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:18.943910    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:18.944075    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.944168    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.944245    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.944342    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.944422    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.944538    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.944665    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.944672    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:06:18.996744    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:06:18.996756    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:06:18.996829    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:06:18.996840    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.996972    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.997082    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.997171    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.997252    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.997394    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.997538    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.997587    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:06:19.061774    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:06:19.061792    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:19.061924    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:19.062001    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:19.062094    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:19.062183    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:19.062322    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:19.062475    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:19.062487    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:06:20.667693    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:06:20.667709    3827 machine.go:97] duration metric: took 13.046112735s to provisionDockerMachine
	I0731 10:06:20.667718    3827 start.go:293] postStartSetup for "ha-393000-m03" (driver="hyperkit")
	I0731 10:06:20.667725    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:06:20.667738    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.667939    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:06:20.667954    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.668063    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.668167    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.668260    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.668365    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:20.711043    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:06:20.714520    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:06:20.714533    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:06:20.714632    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:06:20.714782    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:06:20.714789    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:06:20.714971    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:06:20.725237    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:06:20.756197    3827 start.go:296] duration metric: took 88.463878ms for postStartSetup
	I0731 10:06:20.756221    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.756402    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:06:20.756417    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.756509    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.756594    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.756688    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.756757    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:20.788829    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:06:20.788889    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:06:20.841715    3827 fix.go:56] duration metric: took 13.365618842s for fixHost
	I0731 10:06:20.841743    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.841878    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.841982    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.842069    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.842155    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.842314    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:20.842486    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:20.842494    3827 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 10:06:20.895743    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445580.896263750
	
	I0731 10:06:20.895763    3827 fix.go:216] guest clock: 1722445580.896263750
	I0731 10:06:20.895768    3827 fix.go:229] Guest: 2024-07-31 10:06:20.89626375 -0700 PDT Remote: 2024-07-31 10:06:20.841731 -0700 PDT m=+78.507993684 (delta=54.53275ms)
	I0731 10:06:20.895779    3827 fix.go:200] guest clock delta is within tolerance: 54.53275ms
	I0731 10:06:20.895783    3827 start.go:83] releasing machines lock for "ha-393000-m03", held for 13.419701289s
	I0731 10:06:20.895800    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.895930    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:20.933794    3827 out.go:177] * Found network options:
	I0731 10:06:21.008361    3827 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0731 10:06:21.029193    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:06:21.029220    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:06:21.029239    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.029902    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.030149    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.030274    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:06:21.030303    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	W0731 10:06:21.030372    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:06:21.030402    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:06:21.030458    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:21.030487    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:06:21.030508    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:21.030615    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:21.030657    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:21.030724    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:21.030782    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:21.030837    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:21.030887    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:21.030941    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	W0731 10:06:21.060481    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:06:21.060548    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:06:21.113024    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:06:21.113039    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:06:21.113103    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:06:21.128523    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:06:21.136837    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:06:21.145325    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:06:21.145388    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:06:21.153686    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:06:21.162021    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:06:21.170104    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:06:21.178345    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:06:21.186720    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:06:21.195003    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:06:21.203212    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:06:21.211700    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:06:21.219303    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:06:21.226730    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:21.333036    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:06:21.355400    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:06:21.355468    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:06:21.370793    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:06:21.382599    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:06:21.397116    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:06:21.408366    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:06:21.419500    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:06:21.441593    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:06:21.453210    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:06:21.468638    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:06:21.471686    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:06:21.480107    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:06:21.493473    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:06:21.590098    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:06:21.695002    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:06:21.695025    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:06:21.709644    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:21.804799    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:06:24.090859    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.286034061s)
	I0731 10:06:24.090921    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:06:24.102085    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:06:24.115631    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:06:24.125950    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:06:24.222193    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:06:24.332843    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:24.449689    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:06:24.463232    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:06:24.474652    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:24.567486    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:06:24.631150    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:06:24.631230    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:06:24.635708    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:06:24.635764    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:06:24.638929    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:06:24.666470    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:06:24.666542    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:06:24.686587    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:06:24.729344    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:06:24.771251    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:06:24.792172    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 10:06:24.813314    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:24.813703    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:06:24.818215    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:06:24.828147    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:06:24.828324    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:24.828531    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:24.828552    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:24.837259    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52063
	I0731 10:06:24.837609    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:24.837954    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:24.837967    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:24.838165    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:24.838272    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:06:24.838349    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:24.838424    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:06:24.839404    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:06:24.839647    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:24.839672    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:24.848293    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52065
	I0731 10:06:24.848630    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:24.848982    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:24.848999    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:24.849191    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:24.849297    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:06:24.849393    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.7
	I0731 10:06:24.849399    3827 certs.go:194] generating shared ca certs ...
	I0731 10:06:24.849408    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:06:24.849551    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:06:24.849606    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:06:24.849615    3827 certs.go:256] generating profile certs ...
	I0731 10:06:24.849710    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:06:24.849799    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb
	I0731 10:06:24.849848    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:06:24.849860    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:06:24.849881    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:06:24.849901    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:06:24.849920    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:06:24.849937    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:06:24.849955    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:06:24.849974    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:06:24.849991    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:06:24.850072    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:06:24.850109    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:06:24.850118    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:06:24.850152    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:06:24.850184    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:06:24.850218    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:06:24.850285    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:06:24.850322    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:06:24.850344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:06:24.850366    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:24.850395    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:06:24.850485    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:06:24.850565    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:06:24.850653    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:06:24.850732    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:06:24.882529    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0731 10:06:24.886785    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 10:06:24.896598    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0731 10:06:24.900384    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 10:06:24.910269    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 10:06:24.914011    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 10:06:24.922532    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0731 10:06:24.925784    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 10:06:24.936850    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0731 10:06:24.940321    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 10:06:24.950026    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0731 10:06:24.953055    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 10:06:24.962295    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:06:24.982990    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:06:25.003016    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:06:25.022822    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:06:25.043864    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:06:25.064140    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:06:25.084546    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:06:25.105394    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:06:25.125890    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:06:25.146532    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:06:25.166742    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:06:25.186545    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 10:06:25.200206    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 10:06:25.214106    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 10:06:25.228037    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 10:06:25.242065    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 10:06:25.255847    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 10:06:25.269574    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 10:06:25.283881    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:06:25.288466    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:06:25.297630    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.301289    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.301331    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.305714    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:06:25.314348    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:06:25.322967    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.326578    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.326634    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.330926    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:06:25.339498    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:06:25.348151    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.351535    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.351576    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.355921    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:06:25.364535    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:06:25.368077    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:06:25.372428    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:06:25.376757    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:06:25.380980    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:06:25.385296    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:06:25.389606    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:06:25.393857    3827 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.30.3 docker true true} ...
	I0731 10:06:25.393914    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:06:25.393928    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:06:25.393959    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:06:25.405786    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:06:25.405830    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:06:25.405888    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:06:25.414334    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:06:25.414379    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 10:06:25.422310    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:06:25.435970    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:06:25.449652    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:06:25.463392    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:06:25.466266    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:06:25.476391    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:25.572265    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:06:25.587266    3827 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:06:25.587454    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:25.609105    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:06:25.650600    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:25.776520    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:06:25.790838    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:06:25.791048    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:06:25.791095    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:06:25.791257    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m03" to be "Ready" ...
	I0731 10:06:25.791299    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:25.791305    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.791311    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.791315    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.793351    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:25.793683    3827 node_ready.go:49] node "ha-393000-m03" has status "Ready":"True"
	I0731 10:06:25.793693    3827 node_ready.go:38] duration metric: took 2.426331ms for node "ha-393000-m03" to be "Ready" ...
	I0731 10:06:25.793700    3827 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:25.793737    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:25.793742    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.793753    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.793758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.797877    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:25.803934    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:25.803995    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:25.804000    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.804007    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.804011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.806477    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:25.806997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:25.807005    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.807011    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.807014    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.808989    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:26.304983    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:26.304998    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.305006    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.305010    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.307209    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:26.307839    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:26.307846    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.307852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.307861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.309644    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:26.805493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:26.805510    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.805520    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.805527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.821394    3827 round_trippers.go:574] Response Status: 200 OK in 15 milliseconds
	I0731 10:06:26.822205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:26.822215    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.822221    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.822224    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.827160    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:27.305824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:27.305839    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.305846    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.305848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.308258    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.308744    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:27.308752    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.308758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.308761    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.310974    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.805552    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:27.805567    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.805574    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.805578    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.807860    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.808403    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:27.808410    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.808416    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.808419    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.810436    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.810811    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:28.305577    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:28.305593    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.305600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.305604    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.311583    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:28.312446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:28.312455    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.312461    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.312465    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.314779    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:28.804391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:28.804407    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.804414    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.804420    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.806848    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:28.807227    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:28.807235    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.807241    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.807244    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.809171    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:29.305552    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:29.305615    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.305624    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.305629    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.308134    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.308891    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:29.308900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.308906    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.308909    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.311098    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.805109    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:29.805127    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.805192    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.805198    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.807898    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.808285    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:29.808292    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.808297    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.808300    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.810154    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:30.305017    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:30.305032    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.305045    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.305048    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.307205    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:30.307776    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:30.307783    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.307789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.307792    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.309771    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:30.310293    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:30.805366    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:30.805428    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.805436    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.805440    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.807864    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:30.808309    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:30.808316    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.808322    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.808325    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.810111    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:31.305667    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:31.305700    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.305708    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.305712    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.308126    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:31.308539    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:31.308546    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.308552    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.308556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.310279    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:31.804975    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:31.805002    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.805014    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.805020    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.808534    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:31.809053    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:31.809061    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.809066    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.809069    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.810955    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:32.304759    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:32.304815    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.304830    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.304839    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.308267    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:32.308684    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:32.308692    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.308698    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.308701    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.310475    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:32.310804    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:32.805138    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:32.805163    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.805175    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.805181    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.808419    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:32.809125    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:32.809133    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.809139    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.809143    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.810741    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:33.305088    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:33.305103    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.305109    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.305113    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.307495    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:33.307998    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:33.308005    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.308011    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.308015    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.309595    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:33.806000    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:33.806021    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.806049    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.806056    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.808625    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:33.809248    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:33.809259    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.809264    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.809269    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.810758    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:34.305752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:34.305832    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.305847    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.305853    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.308868    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:34.309591    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:34.309599    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.309605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.309608    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.311263    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:34.311627    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:34.804923    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:34.804948    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.804959    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.804965    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.808036    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:34.808636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:34.808646    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.808654    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.808670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.810398    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:35.305879    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:35.305966    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.305982    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.305991    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.309016    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:35.309584    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:35.309592    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.309598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.309601    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.311155    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:35.804092    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:35.804107    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.804114    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.804117    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.806476    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:35.806988    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:35.806997    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.807002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.807025    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.808897    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.305921    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:36.305943    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.305951    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.305955    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.308670    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:36.309170    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:36.309178    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.309184    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.309199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.310943    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.805015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:36.805085    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.805098    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.805106    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.808215    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:36.808810    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:36.808817    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.808823    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.808827    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.810482    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.810768    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:37.305031    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:37.305055    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.305068    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.305077    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.308209    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:37.308934    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:37.308942    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.308947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.308951    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.310514    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:37.805625    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:37.805671    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.805682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.805687    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.808188    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:37.808728    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:37.808735    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.808741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.808744    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.810288    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:38.305824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:38.305838    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.305845    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.305848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.307926    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:38.308378    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:38.308386    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.308391    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.308395    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.310092    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:38.805380    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:38.805397    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.805406    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.805410    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.807819    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:38.808368    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:38.808376    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.808382    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.808385    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.809904    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:39.305804    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:39.305820    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.305826    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.305830    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.307991    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:39.308527    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:39.308535    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.308541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.308546    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.310495    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:39.310929    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:39.806108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:39.806122    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.806129    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.806132    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.808192    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:39.808709    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:39.808718    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.808727    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.808730    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.810476    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:40.304101    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:40.304125    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.304137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.304144    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.307004    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:40.307629    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:40.307637    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.307643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.307646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.309373    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:40.804289    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:40.804302    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.804329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.804334    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.806678    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:40.807320    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:40.807328    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.807334    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.807338    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.809111    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:41.305710    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:41.305762    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.305770    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.305774    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.307795    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.308244    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:41.308252    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.308258    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.308261    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.310033    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:41.805219    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:41.805235    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.805242    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.805246    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.807574    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.808103    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:41.808112    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.808119    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.808123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.810305    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.810720    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:42.305509    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:42.305569    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.305580    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.305586    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.307774    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:42.308154    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:42.308161    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.308167    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.308170    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.309895    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:42.804631    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:42.804655    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.804667    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.804687    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.808080    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:42.808852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:42.808863    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.808869    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.808874    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.811059    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.304116    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:43.304217    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.304233    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.304239    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.306879    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.307340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:43.307348    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.307354    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.307358    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.308948    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:43.805920    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:43.805934    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.805981    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.805986    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.808009    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.808576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:43.808583    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.808589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.808592    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.810282    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:43.810804    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:44.304703    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:44.304728    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.304798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.304823    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.308376    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:44.308780    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:44.308787    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.308793    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.308797    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.310396    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:44.805218    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:44.805242    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.805255    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.805264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.808404    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:44.808967    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:44.808978    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.808986    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.808990    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.810748    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:45.304672    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:45.304770    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.304784    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.304791    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.307754    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:45.308249    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:45.308256    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.308261    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.308265    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.309903    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:45.804236    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:45.804265    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.804276    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.804281    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.807605    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:45.808214    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:45.808222    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.808228    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.808231    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.810076    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:46.305660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:46.305674    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.305723    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.305727    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.307959    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:46.308389    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:46.308397    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.308403    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.308406    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.310188    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:46.310668    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:46.805585    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:46.805685    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.805700    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.805708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.808399    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:46.808892    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:46.808900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.808910    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.808914    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.810397    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.304911    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:47.304926    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.304933    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.304936    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.307282    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.307761    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.307768    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.307774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.307777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.309541    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.309921    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.309931    3827 pod_ready.go:81] duration metric: took 21.505983976s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.309937    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.309966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 10:06:47.309971    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.309977    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.309980    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.311547    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.311995    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.312003    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.312009    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.312013    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.313414    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.313802    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.313811    3827 pod_ready.go:81] duration metric: took 3.869093ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.313818    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.313850    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 10:06:47.313855    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.313861    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.313865    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.315523    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.315938    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.315947    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.315955    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.315959    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.317522    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.317922    3827 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.317931    3827 pod_ready.go:81] duration metric: took 4.10711ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.317937    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.317971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 10:06:47.317976    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.317982    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.317985    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.319520    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.319893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:47.319900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.319906    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.319909    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.321439    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.321816    3827 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.321825    3827 pod_ready.go:81] duration metric: took 3.88293ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.321832    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.321862    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 10:06:47.321867    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.321872    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.321876    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.323407    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.323756    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:47.323763    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.323769    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.323773    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.325384    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.325703    3827 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.325712    3827 pod_ready.go:81] duration metric: took 3.875112ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.325727    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.505410    3827 request.go:629] Waited for 179.649549ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:06:47.505447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:06:47.505454    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.505462    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.505467    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.508003    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.705861    3827 request.go:629] Waited for 197.38651ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.705965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.705976    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.705987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.705997    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.708863    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.709477    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.709486    3827 pod_ready.go:81] duration metric: took 383.754198ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.709493    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.905743    3827 request.go:629] Waited for 196.205437ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:47.905783    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:47.905790    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.905812    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.905826    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.908144    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.106945    3827 request.go:629] Waited for 198.217758ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:48.106991    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:48.106998    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.107017    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.107023    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.109503    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.109889    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.109898    3827 pod_ready.go:81] duration metric: took 400.399458ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.109910    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.306502    3827 request.go:629] Waited for 196.553294ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:48.306576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:48.306583    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.306589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.306593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.308907    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.506077    3827 request.go:629] Waited for 196.82354ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:48.506171    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:48.506180    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.506189    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.506195    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.508341    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.508805    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.508814    3827 pod_ready.go:81] duration metric: took 398.898513ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.508829    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.706656    3827 request.go:629] Waited for 197.780207ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:48.706753    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:48.706765    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.706776    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.706784    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.709960    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:48.906621    3827 request.go:629] Waited for 195.987746ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:48.906714    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:48.906726    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.906737    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.906744    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.910100    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:48.910537    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.910550    3827 pod_ready.go:81] duration metric: took 401.715473ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.910559    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.106125    3827 request.go:629] Waited for 195.518023ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:49.106250    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:49.106262    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.106273    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.106280    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.109411    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:49.306599    3827 request.go:629] Waited for 196.360989ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:49.306720    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:49.306730    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.306741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.306747    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.309953    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:49.310311    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:49.310320    3827 pod_ready.go:81] duration metric: took 399.753992ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.310327    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.505092    3827 request.go:629] Waited for 194.718659ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:49.505129    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:49.505134    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.505140    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.505144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.510347    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:49.706499    3827 request.go:629] Waited for 195.722594ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:49.706547    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:49.706556    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.706623    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.706634    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.709639    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:49.710039    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:49.710049    3827 pod_ready.go:81] duration metric: took 399.716837ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.710061    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.906378    3827 request.go:629] Waited for 196.280735ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:49.906412    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:49.906418    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.906425    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.906442    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.911634    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:50.106586    3827 request.go:629] Waited for 194.536585ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:50.106637    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:50.106652    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.106717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.106725    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.109661    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:50.110176    3827 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.110189    3827 pod_ready.go:81] duration metric: took 400.121095ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.110197    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.306216    3827 request.go:629] Waited for 195.968962ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:50.306280    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:50.306286    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.306291    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.306301    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.308314    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:50.505180    3827 request.go:629] Waited for 196.336434ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:50.505320    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:50.505332    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.505344    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.505351    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.508601    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:50.509059    3827 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.509072    3827 pod_ready.go:81] duration metric: took 398.868353ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.509081    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.705014    3827 request.go:629] Waited for 195.886159ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:50.705123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:50.705134    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.705144    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.705151    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.708274    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:50.906912    3827 request.go:629] Waited for 198.179332ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:50.906985    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:50.906991    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.906997    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.907002    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.908938    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:50.909509    3827 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.909519    3827 pod_ready.go:81] duration metric: took 400.431581ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.909525    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.106576    3827 request.go:629] Waited for 197.012349ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:51.106660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:51.106668    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.106677    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.106682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.109021    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.305894    3827 request.go:629] Waited for 196.495089ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:51.305945    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:51.306000    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.306010    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.306018    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.308864    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.309301    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:51.309311    3827 pod_ready.go:81] duration metric: took 399.779835ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.309324    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.504969    3827 request.go:629] Waited for 195.610894ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:51.505060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:51.505066    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.505072    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.505076    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.507056    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:51.705447    3827 request.go:629] Waited for 197.942219ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:51.705508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:51.705515    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.705522    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.705527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.707999    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.708367    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:51.708379    3827 pod_ready.go:81] duration metric: took 399.049193ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.708391    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.906469    3827 request.go:629] Waited for 198.035792ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:51.906523    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:51.906531    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.906539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.906545    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.909082    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.105038    3827 request.go:629] Waited for 195.597271ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:52.105087    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:52.105095    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.105157    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.105168    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.108049    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.108591    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:52.108604    3827 pod_ready.go:81] duration metric: took 400.204131ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:52.108615    3827 pod_ready.go:38] duration metric: took 26.314911332s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:52.108628    3827 api_server.go:52] waiting for apiserver process to appear ...
	I0731 10:06:52.108680    3827 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:06:52.120989    3827 api_server.go:72] duration metric: took 26.533695803s to wait for apiserver process to appear ...
	I0731 10:06:52.121002    3827 api_server.go:88] waiting for apiserver healthz status ...
	I0731 10:06:52.121014    3827 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 10:06:52.124310    3827 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 10:06:52.124340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 10:06:52.124344    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.124353    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.124358    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.124912    3827 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 10:06:52.124978    3827 api_server.go:141] control plane version: v1.30.3
	I0731 10:06:52.124989    3827 api_server.go:131] duration metric: took 3.981645ms to wait for apiserver health ...
	I0731 10:06:52.124994    3827 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 10:06:52.305762    3827 request.go:629] Waited for 180.72349ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.305845    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.305853    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.305861    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.305872    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.310548    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:52.315274    3827 system_pods.go:59] 24 kube-system pods found
	I0731 10:06:52.315286    3827 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 10:06:52.315289    3827 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:52.315292    3827 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:52.315295    3827 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:52.315298    3827 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:52.315301    3827 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:52.315303    3827 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:52.315306    3827 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:52.315311    3827 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:52.315313    3827 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:52.315316    3827 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:52.315319    3827 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:52.315322    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:52.315327    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:52.315330    3827 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:52.315333    3827 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:52.315335    3827 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:52.315338    3827 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:52.315341    3827 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:52.315343    3827 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:52.315346    3827 system_pods.go:61] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:52.315348    3827 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:52.315350    3827 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:52.315353    3827 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:52.315358    3827 system_pods.go:74] duration metric: took 190.3593ms to wait for pod list to return data ...
	I0731 10:06:52.315363    3827 default_sa.go:34] waiting for default service account to be created ...
	I0731 10:06:52.505103    3827 request.go:629] Waited for 189.702061ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:52.505178    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:52.505187    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.505195    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.505199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.507558    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.507636    3827 default_sa.go:45] found service account: "default"
	I0731 10:06:52.507644    3827 default_sa.go:55] duration metric: took 192.276446ms for default service account to be created ...
	I0731 10:06:52.507666    3827 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 10:06:52.705427    3827 request.go:629] Waited for 197.710286ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.705484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.705497    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.705519    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.705526    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.711904    3827 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 10:06:52.716760    3827 system_pods.go:86] 24 kube-system pods found
	I0731 10:06:52.716772    3827 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 10:06:52.716777    3827 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:52.716780    3827 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:52.716783    3827 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:52.716787    3827 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:52.716790    3827 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:52.716794    3827 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:52.716798    3827 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:52.716801    3827 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:52.716805    3827 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:52.716809    3827 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:52.716813    3827 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:52.716816    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:52.716819    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:52.716823    3827 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:52.716827    3827 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:52.716830    3827 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:52.716833    3827 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:52.716836    3827 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:52.716854    3827 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:52.716860    3827 system_pods.go:89] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:52.716864    3827 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:52.716867    3827 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:52.716871    3827 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:52.716876    3827 system_pods.go:126] duration metric: took 209.203713ms to wait for k8s-apps to be running ...
	I0731 10:06:52.716881    3827 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 10:06:52.716936    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:06:52.731223    3827 system_svc.go:56] duration metric: took 14.33545ms WaitForService to wait for kubelet
	I0731 10:06:52.731240    3827 kubeadm.go:582] duration metric: took 27.143948309s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:06:52.731255    3827 node_conditions.go:102] verifying NodePressure condition ...
	I0731 10:06:52.906178    3827 request.go:629] Waited for 174.879721ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:52.906213    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:52.906218    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.906257    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.906264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.908378    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.909014    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909025    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909032    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909035    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909039    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909041    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909045    3827 node_conditions.go:105] duration metric: took 177.780993ms to run NodePressure ...
	I0731 10:06:52.909053    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:06:52.909067    3827 start.go:255] writing updated cluster config ...
	I0731 10:06:52.931184    3827 out.go:177] 
	I0731 10:06:52.952773    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:52.952858    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:52.974676    3827 out.go:177] * Starting "ha-393000-m04" worker node in "ha-393000" cluster
	I0731 10:06:53.016553    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:06:53.016583    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:06:53.016766    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:06:53.016784    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:06:53.016901    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:53.017869    3827 start.go:360] acquireMachinesLock for ha-393000-m04: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:06:53.017982    3827 start.go:364] duration metric: took 90.107µs to acquireMachinesLock for "ha-393000-m04"
	I0731 10:06:53.018005    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:06:53.018013    3827 fix.go:54] fixHost starting: m04
	I0731 10:06:53.018399    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:53.018423    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:53.027659    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52069
	I0731 10:06:53.028033    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:53.028349    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:53.028359    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:53.028586    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:53.028695    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:06:53.028810    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 10:06:53.028891    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.028978    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 10:06:53.029947    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid 3095 missing from process table
	I0731 10:06:53.029967    3827 fix.go:112] recreateIfNeeded on ha-393000-m04: state=Stopped err=<nil>
	I0731 10:06:53.029982    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	W0731 10:06:53.030076    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:06:53.051730    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m04" ...
	I0731 10:06:53.093566    3827 main.go:141] libmachine: (ha-393000-m04) Calling .Start
	I0731 10:06:53.093954    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.094004    3827 main.go:141] libmachine: (ha-393000-m04) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid
	I0731 10:06:53.094113    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Using UUID 8a49f5e0-ba79-41ac-9a76-c032dc065628
	I0731 10:06:53.120538    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Generated MAC d2:d8:fb:1d:1:ee
	I0731 10:06:53.120559    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:06:53.120750    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00032a1e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:53.120805    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00032a1e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:53.120864    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "8a49f5e0-ba79-41ac-9a76-c032dc065628", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:06:53.120909    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 8a49f5e0-ba79-41ac-9a76-c032dc065628 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:06:53.120925    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:06:53.122259    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Pid is 3870
	I0731 10:06:53.122766    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 0
	I0731 10:06:53.122781    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.122872    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3870
	I0731 10:06:53.125179    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 10:06:53.125242    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:06:53.125254    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:06:53.125266    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:06:53.125273    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:06:53.125280    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:06:53.125287    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Found match: d2:d8:fb:1d:1:ee
	I0731 10:06:53.125295    3827 main.go:141] libmachine: (ha-393000-m04) DBG | IP: 192.169.0.8
	I0731 10:06:53.125358    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetConfigRaw
	I0731 10:06:53.126014    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:06:53.126188    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:53.126707    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:06:53.126722    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:06:53.126959    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:06:53.127071    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:06:53.127158    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:06:53.127274    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:06:53.127389    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:06:53.127538    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:53.127705    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:06:53.127713    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:06:53.131247    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:06:53.140131    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:06:53.141373    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:53.141406    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:53.141429    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:53.141447    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:53.528683    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:06:53.528699    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:06:53.643451    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:53.643474    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:53.643483    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:53.643491    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:53.644344    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:06:53.644357    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:06:59.241509    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:06:59.241622    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:06:59.241636    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:06:59.265250    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:07:04.190144    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:07:04.190159    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.190326    3827 buildroot.go:166] provisioning hostname "ha-393000-m04"
	I0731 10:07:04.190338    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.190427    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.190528    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.190617    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.190711    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.190826    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.190962    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.191110    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.191119    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m04 && echo "ha-393000-m04" | sudo tee /etc/hostname
	I0731 10:07:04.259087    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m04
	
	I0731 10:07:04.259102    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.259236    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.259339    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.259439    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.259526    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.259647    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.259797    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.259811    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:07:04.323580    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:07:04.323604    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:07:04.323616    3827 buildroot.go:174] setting up certificates
	I0731 10:07:04.323623    3827 provision.go:84] configureAuth start
	I0731 10:07:04.323630    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.323758    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:04.323858    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.323932    3827 provision.go:143] copyHostCerts
	I0731 10:07:04.323960    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:07:04.324021    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:07:04.324027    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:07:04.324150    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:07:04.324352    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:07:04.324397    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:07:04.324402    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:07:04.324482    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:07:04.324627    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:07:04.324668    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:07:04.324674    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:07:04.324752    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:07:04.324900    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m04 san=[127.0.0.1 192.169.0.8 ha-393000-m04 localhost minikube]
	I0731 10:07:04.518738    3827 provision.go:177] copyRemoteCerts
	I0731 10:07:04.518793    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:07:04.518809    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.518951    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.519038    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.519124    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.519202    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:04.553750    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:07:04.553834    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:07:04.574235    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:07:04.574311    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:07:04.594359    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:07:04.594433    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:07:04.614301    3827 provision.go:87] duration metric: took 290.6663ms to configureAuth
	I0731 10:07:04.614319    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:07:04.614509    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:04.614526    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:04.614676    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.614777    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.614880    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.614987    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.615110    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.615236    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.615386    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.615394    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:07:04.672493    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:07:04.672505    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:07:04.672600    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:07:04.672612    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.672752    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.672835    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.672958    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.673042    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.673159    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.673303    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.673352    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:07:04.741034    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	Environment=NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:07:04.741052    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.741187    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.741288    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.741387    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.741494    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.741621    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.741755    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.741771    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:07:06.325916    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:07:06.325931    3827 machine.go:97] duration metric: took 13.199216588s to provisionDockerMachine
	I0731 10:07:06.325941    3827 start.go:293] postStartSetup for "ha-393000-m04" (driver="hyperkit")
	I0731 10:07:06.325948    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:07:06.325960    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.326146    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:07:06.326163    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.326257    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.326346    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.326438    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.326522    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.369998    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:07:06.375343    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:07:06.375359    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:07:06.375470    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:07:06.375663    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:07:06.375669    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:07:06.375894    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:07:06.394523    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:07:06.415884    3827 start.go:296] duration metric: took 89.928396ms for postStartSetup
	I0731 10:07:06.415906    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.416074    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:07:06.416088    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.416193    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.416287    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.416381    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.416451    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.451487    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:07:06.451545    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:07:06.482558    3827 fix.go:56] duration metric: took 13.464545279s for fixHost
	I0731 10:07:06.482584    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.482724    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.482806    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.482891    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.482992    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.483122    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:06.483263    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:06.483270    3827 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 10:07:06.539713    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445626.658160546
	
	I0731 10:07:06.539725    3827 fix.go:216] guest clock: 1722445626.658160546
	I0731 10:07:06.539731    3827 fix.go:229] Guest: 2024-07-31 10:07:06.658160546 -0700 PDT Remote: 2024-07-31 10:07:06.482574 -0700 PDT m=+124.148842929 (delta=175.586546ms)
	I0731 10:07:06.539746    3827 fix.go:200] guest clock delta is within tolerance: 175.586546ms
	I0731 10:07:06.539751    3827 start.go:83] releasing machines lock for "ha-393000-m04", held for 13.521760862s
	I0731 10:07:06.539766    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.539895    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:06.564336    3827 out.go:177] * Found network options:
	I0731 10:07:06.583958    3827 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	W0731 10:07:06.605128    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605143    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605170    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:07:06.605183    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605593    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605717    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605786    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:07:06.605816    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	W0731 10:07:06.605831    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605845    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605864    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:07:06.605930    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:07:06.605931    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.605944    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.606068    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.606081    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.606172    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.606197    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.606270    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.606322    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.606369    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	W0731 10:07:06.638814    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:07:06.638878    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:07:06.685734    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:07:06.685752    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:07:06.685831    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:07:06.701869    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:07:06.710640    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:07:06.719391    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:07:06.719452    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:07:06.728151    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:07:06.736695    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:07:06.745525    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:07:06.754024    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:07:06.762489    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:07:06.770723    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:07:06.779179    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:07:06.787524    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:07:06.795278    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:07:06.802833    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:06.908838    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:07:06.929085    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:07:06.929153    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:07:06.946994    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:07:06.958792    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:07:06.977007    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:07:06.987118    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:07:06.998383    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:07:07.019497    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:07:07.030189    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:07:07.045569    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:07:07.048595    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:07:07.055870    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:07:07.070037    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:07:07.166935    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:07:07.272420    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:07:07.272447    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:07:07.286182    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:07.397807    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:07:09.678871    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.281044692s)
	I0731 10:07:09.678935    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:07:09.691390    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:07:09.706154    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:07:09.718281    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:07:09.818061    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:07:09.918372    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:10.020296    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:07:10.034132    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:07:10.045516    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:10.140924    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:07:10.198542    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:07:10.198622    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:07:10.202939    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:07:10.203007    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:07:10.206254    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:07:10.238107    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:07:10.238184    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:07:10.256129    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:07:10.301307    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:07:10.337880    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:07:10.396169    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 10:07:10.454080    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	I0731 10:07:10.491070    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:10.491478    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:07:10.496573    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:07:10.506503    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:07:10.506687    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:10.506931    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:07:10.506954    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:07:10.515949    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52091
	I0731 10:07:10.516322    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:07:10.516656    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:07:10.516668    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:07:10.516893    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:07:10.517004    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:07:10.517099    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:07:10.517181    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:07:10.518192    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:07:10.518454    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:07:10.518477    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:07:10.527151    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52093
	I0731 10:07:10.527586    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:07:10.527914    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:07:10.527931    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:07:10.528158    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:07:10.528268    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:07:10.528367    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.8
	I0731 10:07:10.528374    3827 certs.go:194] generating shared ca certs ...
	I0731 10:07:10.528388    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:07:10.528576    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:07:10.528655    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:07:10.528666    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:07:10.528692    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:07:10.528712    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:07:10.528731    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:07:10.528834    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:07:10.528887    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:07:10.528897    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:07:10.528933    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:07:10.528968    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:07:10.529000    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:07:10.529077    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:07:10.529114    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.529135    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.529152    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.529176    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:07:10.550191    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:07:10.570588    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:07:10.590746    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:07:10.611034    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:07:10.631281    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:07:10.651472    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:07:10.671880    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:07:10.676790    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:07:10.685541    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.689430    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.689496    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.694391    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:07:10.703456    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:07:10.712113    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.715734    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.715795    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.720285    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:07:10.728964    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:07:10.737483    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.741091    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.741135    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.745570    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:07:10.754084    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:07:10.757225    3827 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 10:07:10.757258    3827 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.30.3 docker false true} ...
	I0731 10:07:10.757327    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:07:10.757375    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:07:10.764753    3827 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 10:07:10.764797    3827 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256
	I0731 10:07:10.772338    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 10:07:10.772344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256
	I0731 10:07:10.772398    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:07:10.772434    3827 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 10:07:10.772437    3827 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 10:07:10.780324    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 10:07:10.780354    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 10:07:10.780356    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 10:07:10.780369    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 10:07:10.799303    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 10:07:10.799462    3827 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 10:07:10.842469    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 10:07:10.842511    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 10:07:11.478912    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0731 10:07:11.486880    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:07:11.501278    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:07:11.515550    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:07:11.518663    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:07:11.528373    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:11.625133    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:07:11.645175    3827 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0731 10:07:11.645375    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:11.651211    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:07:11.692705    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:11.797111    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:07:12.534860    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:07:12.535084    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:07:12.535128    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:07:12.535291    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m04" to be "Ready" ...
	I0731 10:07:12.535335    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:12.535339    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:12.535359    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:12.535366    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:12.537469    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:13.035600    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:13.035613    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:13.035620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:13.035622    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:13.037811    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:13.536601    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:13.536621    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:13.536630    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:13.536636    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:13.539103    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.035926    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:14.035943    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:14.035952    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:14.035957    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:14.038327    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.535691    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:14.535711    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:14.535719    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:14.535723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:14.538107    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.538174    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:15.035707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:15.035726    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:15.035735    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:15.035739    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:15.037991    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:15.535587    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:15.535602    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:15.535658    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:15.535663    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:15.537787    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.035475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:16.035497    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:16.035550    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:16.035555    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:16.037882    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.536666    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:16.536687    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:16.536712    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:16.536719    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:16.538796    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.538904    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:17.035473    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:17.035488    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:17.035495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:17.035498    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:17.037610    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:17.535997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:17.536074    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:17.536089    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:17.536096    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:17.539102    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:18.035624    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:18.035638    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:18.035646    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:18.035652    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:18.037956    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:18.535491    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:18.535589    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:18.535603    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:18.535610    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:18.538819    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:18.538965    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:19.036954    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:19.037007    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:19.037028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:19.037033    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:19.039345    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:19.536847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:19.536862    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:19.536870    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:19.536873    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:19.538820    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:20.037064    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:20.037079    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:20.037086    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:20.037089    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:20.038945    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:20.536127    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:20.536138    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:20.536145    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:20.536150    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:20.538039    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:21.036613    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:21.036684    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:21.036695    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:21.036701    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:21.039123    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:21.039186    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:21.536684    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:21.536700    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:21.536705    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:21.536708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:21.538918    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:22.036722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:22.036736    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:22.036743    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:22.036746    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:22.038627    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:22.536686    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:22.536704    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:22.536714    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:22.536718    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:22.538549    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:23.036470    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:23.036482    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:23.036489    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:23.036494    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:23.038533    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:23.535581    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:23.535639    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:23.535653    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:23.535667    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:23.539678    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:23.539740    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:24.036874    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:24.036948    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:24.036959    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:24.036965    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:24.039843    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:24.536241    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:24.536307    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:24.536318    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:24.536323    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:24.538807    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:25.036279    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:25.036343    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:25.036356    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:25.036362    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:25.038454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:25.535942    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:25.535954    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:25.535962    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:25.535967    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:25.538068    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:26.036823    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:26.036838    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:26.036845    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:26.036848    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:26.038942    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:26.039008    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:26.535480    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:26.535499    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:26.535533    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:26.535539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:26.538039    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:27.036202    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:27.036213    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:27.036219    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:27.036222    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:27.038071    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:27.537206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:27.537226    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:27.537236    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:27.537248    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:27.539573    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:28.036203    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:28.036217    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:28.036223    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:28.036225    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:28.038017    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:28.536971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:28.536988    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:28.536998    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:28.537003    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:28.539378    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:28.539442    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:29.035655    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:29.035667    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:29.035673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:29.035676    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:29.037786    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:29.537109    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:29.537124    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:29.537132    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:29.537144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:29.539430    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:30.035887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:30.035899    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:30.035905    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:30.035908    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:30.037803    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:30.535679    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:30.535701    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:30.535718    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:30.535723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:30.539029    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:31.036151    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:31.036166    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:31.036175    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:31.036179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:31.038532    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:31.038593    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:31.536698    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:31.536710    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:31.536717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:31.536720    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:31.538484    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:32.037162    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:32.037178    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:32.037185    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:32.037188    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:32.039081    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:32.536065    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:32.536085    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:32.536095    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:32.536099    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:32.538365    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:33.036492    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:33.036513    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:33.036523    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:33.036527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:33.038851    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:33.038919    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:33.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:33.535566    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:33.535572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:33.535576    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:33.537575    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:34.036894    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:34.036912    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:34.036923    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:34.036932    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:34.040173    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:34.535858    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:34.535912    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:34.535919    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:34.535922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:34.537915    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:35.036636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:35.036670    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:35.036677    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:35.036682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:35.038861    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:35.038930    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:35.535814    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:35.535827    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:35.535835    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:35.535840    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:35.538360    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:36.035769    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:36.035785    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:36.035795    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:36.035799    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:36.038202    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:36.535426    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:36.535438    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:36.535445    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:36.535449    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:36.537303    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:37.035456    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:37.035470    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:37.035479    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:37.035483    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:37.037630    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:37.536548    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:37.536562    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:37.536568    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:37.536572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:37.538659    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:37.538720    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:38.036407    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:38.036421    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:38.036427    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:38.036432    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:38.038467    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:38.537359    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:38.537378    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:38.537387    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:38.537392    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:38.539892    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:39.036414    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:39.036470    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:39.036486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:39.036495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:39.039521    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:39.535817    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:39.535832    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:39.535839    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:39.535843    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:39.537796    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:40.035880    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:40.035896    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:40.035902    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:40.035906    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:40.037712    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:40.037778    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:40.535492    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:40.535523    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:40.535536    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:40.535543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:40.538475    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:41.035745    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:41.035758    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:41.035770    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:41.035774    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:41.037656    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:41.535726    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:41.535738    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:41.535744    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:41.535747    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:41.537897    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:42.036537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:42.036554    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:42.036564    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:42.036573    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:42.039525    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:42.039600    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:42.535450    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:42.535465    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:42.535472    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:42.535475    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:42.537399    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:43.035576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:43.035592    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:43.035598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:43.035602    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:43.038048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:43.536787    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:43.536822    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:43.536832    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:43.536837    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:43.539146    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:44.036148    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:44.036161    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:44.036169    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:44.036173    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:44.038382    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:44.536653    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:44.536709    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:44.536717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:44.536720    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:44.538695    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:44.538753    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:45.036650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:45.036662    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:45.036668    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:45.036672    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:45.038555    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:45.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:45.535571    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:45.535582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:45.535590    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:45.538335    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:46.035712    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:46.035726    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:46.035735    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:46.035740    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:46.038035    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:46.535534    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:46.535549    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:46.535557    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:46.535564    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:46.537974    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:47.035871    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:47.035887    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:47.035893    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:47.035897    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:47.037864    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:47.037931    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:47.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:47.535564    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:47.535570    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:47.535573    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:47.537590    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:48.035461    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:48.035531    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:48.035539    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:48.035543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:48.037510    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:48.536520    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:48.536535    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:48.536541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:48.536544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:48.538561    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:49.035436    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:49.035448    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:49.035454    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:49.035458    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:49.037204    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:49.535574    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:49.535586    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:49.535592    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:49.535595    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:49.537443    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:49.537505    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:50.036533    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:50.036547    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:50.036562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:50.036566    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:50.038478    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:50.536624    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:50.536636    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:50.536642    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:50.536646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:50.538734    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.036016    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:51.036035    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:51.036044    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:51.036049    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:51.038643    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.536662    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:51.536677    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:51.536686    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:51.536691    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:51.539033    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.539099    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:52.036475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:52.036490    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:52.036499    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:52.036503    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:52.038975    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:52.537013    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:52.537034    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:52.537041    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:52.537045    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:52.539229    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.037093    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:53.037106    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:53.037113    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:53.037117    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:53.039169    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.536447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:53.536468    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:53.536478    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:53.536486    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:53.539425    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.539565    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:54.035597    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:54.035609    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:54.035615    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:54.035618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:54.037574    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:54.535484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:54.535503    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:54.535509    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:54.535514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:54.537529    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:55.036258    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:55.036270    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:55.036277    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:55.036280    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:55.038186    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:55.536493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:55.536513    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:55.536526    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:55.536533    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:55.539517    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:55.539589    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:56.035565    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:56.035586    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:56.035599    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:56.035605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:56.040006    3827 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0731 10:07:56.536361    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:56.536374    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:56.536380    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:56.536383    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:56.538540    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:57.036446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:57.036544    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:57.036560    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:57.036567    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:57.039754    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:57.536620    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:57.536630    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:57.536637    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:57.536639    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:57.538482    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:58.036499    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:58.036518    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:58.036527    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:58.036532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:58.039244    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:58.039325    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:58.537076    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:58.537105    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:58.537197    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:58.537204    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:58.539718    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:59.037046    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:59.037127    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:59.037142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:59.037149    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:59.040197    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:59.536758    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:59.536790    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:59.536798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:59.536802    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:59.538842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:00.035440    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:00.035453    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:00.035460    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:00.035463    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:00.037506    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:00.536873    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:00.536895    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:00.536906    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:00.536913    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:00.540041    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:00.540123    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:01.036175    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:01.036225    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:01.036239    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:01.036248    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:01.039214    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:01.535960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:01.535973    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:01.535979    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:01.535983    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:01.538089    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:02.036835    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:02.036856    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:02.036868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:02.036875    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:02.039802    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:02.536647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:02.536660    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:02.536667    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:02.536670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:02.538840    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:03.036159    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:03.036174    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:03.036181    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:03.036184    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:03.038276    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:03.038354    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:03.536974    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:03.536990    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:03.536996    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:03.537000    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:03.538828    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:04.036300    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:04.036363    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:04.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:04.036391    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:04.038707    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:04.535718    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:04.535737    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:04.535749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:04.535759    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:04.538366    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:05.036299    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:05.036316    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:05.036350    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:05.036354    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:05.038510    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:05.038568    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:05.535824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:05.535837    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:05.535843    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:05.535846    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:05.537780    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:06.036578    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:06.036592    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:06.036607    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:06.036612    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:06.038642    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:06.535656    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:06.535670    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:06.535679    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:06.535682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:06.538248    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:07.036322    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:07.036396    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:07.036407    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:07.036412    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:07.038943    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:07.039003    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:07.536357    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:07.536370    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:07.536379    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:07.536384    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:07.538778    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:08.036360    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:08.036375    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:08.036381    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:08.036384    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:08.038393    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:08.536197    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:08.536266    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:08.536278    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:08.536284    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:08.538997    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:09.036883    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:09.036911    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:09.036918    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:09.036922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:09.039071    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:09.039137    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:09.535649    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:09.535664    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:09.535673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:09.535677    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:09.537998    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:10.036205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:10.036229    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:10.036241    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:10.036247    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:10.039273    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:10.536564    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:10.536575    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:10.536582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:10.536585    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:10.538369    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:11.036693    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:11.036710    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:11.036749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:11.036753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:11.038831    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:11.535438    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:11.535452    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:11.535461    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:11.535466    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:11.537490    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:11.537597    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:12.035786    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:12.035805    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:12.035812    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:12.035816    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:12.038145    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:12.536840    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:12.536858    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:12.536868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:12.536881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:12.538815    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.037034    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:13.037049    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:13.037056    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:13.037059    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:13.038933    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.535502    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:13.535519    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:13.535593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:13.535600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:13.537560    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.537648    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:14.036280    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:14.036300    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:14.036312    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:14.036322    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:14.039000    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:14.535507    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:14.535527    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:14.535537    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:14.535543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:14.538228    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:15.036543    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:15.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:15.036634    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:15.036643    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:15.039762    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:15.535993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:15.536006    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:15.536012    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:15.536015    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:15.538186    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:15.538254    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:16.035582    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:16.035595    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:16.035602    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:16.035605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:16.037656    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:16.536650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:16.536663    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:16.536709    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:16.536713    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:16.538604    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:17.036351    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:17.036372    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:17.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:17.036393    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:17.039451    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:17.536542    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:17.536560    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:17.536573    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:17.536582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:17.539454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:17.539591    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:18.036512    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:18.036578    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:18.036588    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:18.036593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:18.038886    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:18.535537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:18.535549    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:18.535554    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:18.535559    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:18.537559    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:19.035943    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:19.035968    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:19.035980    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:19.035987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:19.038665    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:19.536893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:19.536911    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:19.536920    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:19.536925    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:19.539416    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:20.036463    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:20.036479    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:20.036495    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:20.036500    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:20.038824    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:20.038907    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:20.536286    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:20.536306    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:20.536313    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:20.536316    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:20.538429    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:21.036034    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:21.036045    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:21.036051    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:21.036055    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:21.038101    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:21.535690    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:21.535711    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:21.535732    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:21.535740    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:21.538264    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:22.036592    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:22.036604    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:22.036610    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:22.036613    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:22.038773    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:22.536090    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:22.536103    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:22.536109    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:22.536114    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:22.537988    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:22.538057    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:23.035526    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:23.035555    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:23.035562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:23.035567    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:23.037480    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:23.536652    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:23.536666    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:23.536673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:23.536677    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:23.538667    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:24.036746    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:24.036766    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:24.036778    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:24.036789    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:24.039353    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:24.536440    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:24.536452    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:24.536459    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:24.536463    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:24.538250    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:24.538315    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:25.036622    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:25.036643    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:25.036656    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:25.036666    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:25.039764    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:25.535710    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:25.535721    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:25.535737    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:25.535742    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:25.537637    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:26.036253    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:26.036276    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:26.036338    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:26.036343    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:26.038674    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:26.536815    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:26.536828    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:26.536834    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:26.536838    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:26.538867    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:26.538932    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:27.035852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:27.035864    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:27.035869    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:27.035872    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:27.038024    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:27.535997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:27.536016    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:27.536028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:27.536036    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:27.539189    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:28.035934    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:28.036002    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:28.036011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:28.036014    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:28.037996    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:28.535538    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:28.535554    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:28.535561    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:28.535563    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:28.537842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:29.037018    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:29.037032    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:29.037039    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:29.037042    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:29.038983    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:29.039043    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:29.535757    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:29.535769    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:29.535775    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:29.535778    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:29.537697    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:30.036529    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:30.036548    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:30.036557    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:30.036562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:30.038833    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:30.535560    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:30.535570    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:30.535576    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:30.535579    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:30.537657    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:31.035508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:31.035520    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:31.035527    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:31.035531    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:31.037575    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:31.536786    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:31.536800    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:31.536806    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:31.536809    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:31.538674    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:31.538731    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:32.035819    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:32.035833    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:32.035842    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:32.035848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:32.038170    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:32.535455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:32.535471    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:32.535481    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:32.535487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:32.537802    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:33.037123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:33.037156    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:33.037166    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:33.037171    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:33.039252    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:33.535741    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:33.535754    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:33.535760    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:33.535763    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:33.537979    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:34.035638    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:34.035651    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:34.035658    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:34.035661    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:34.037722    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:34.037778    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:34.535808    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:34.535823    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:34.535831    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:34.535834    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:34.538223    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:35.036584    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:35.036609    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:35.036620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:35.036625    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:35.039788    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:35.535720    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:35.535732    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:35.535738    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:35.535741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:35.537506    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:36.036439    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:36.036484    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:36.036492    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:36.036498    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:36.038534    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:36.038591    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:36.535446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:36.535458    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:36.535465    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:36.535467    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:36.537309    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:37.035737    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:37.035776    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:37.035789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:37.035794    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:37.037928    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:37.535410    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:37.535422    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:37.535430    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:37.535433    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:37.537627    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:38.036658    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:38.036738    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:38.036753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:38.036760    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:38.039378    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:38.039521    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:38.535459    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:38.535474    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:38.535490    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:38.535494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:38.537817    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:39.036931    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:39.036949    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:39.036957    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:39.036962    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:39.039286    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:39.536447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:39.536472    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:39.536487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:39.536491    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:39.538440    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:40.036354    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:40.036378    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:40.036463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:40.036469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:40.039363    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:40.535847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:40.535866    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:40.535878    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:40.535883    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:40.538740    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:40.538822    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:41.036206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:41.036221    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:41.036229    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:41.036234    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:41.038292    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:41.535741    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:41.535753    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:41.535759    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:41.535764    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:41.537837    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:42.036537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:42.036558    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:42.036566    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:42.036570    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:42.039104    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:42.536474    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:42.536484    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:42.536491    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:42.536495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:42.538339    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:43.035887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:43.035913    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:43.035925    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:43.035931    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:43.038963    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:43.039028    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:43.537036    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:43.537050    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:43.537056    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:43.537059    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:43.539282    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:44.035937    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:44.035949    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:44.035954    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:44.035958    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:44.037693    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:44.536399    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:44.536470    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:44.536481    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:44.536485    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:44.538818    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:45.036937    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:45.036952    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:45.036960    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:45.036966    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:45.039363    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:45.039449    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:45.535403    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:45.535415    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:45.535421    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:45.535424    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:45.537208    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:46.037001    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:46.037088    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:46.037104    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:46.037110    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:46.040342    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:46.536255    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:46.536269    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:46.536278    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:46.536284    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:46.538801    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:47.037251    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:47.037286    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:47.037297    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:47.037304    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:47.039048    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:47.537021    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:47.537064    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:47.537071    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:47.537076    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:47.539084    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:47.539154    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:48.037354    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:48.037369    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:48.037376    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:48.037379    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:48.039646    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:48.536219    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:48.536236    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:48.536272    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:48.536276    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:48.538242    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:49.035446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:49.035459    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:49.035465    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:49.035469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:49.037563    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:49.535517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:49.535533    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:49.535540    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:49.535543    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:49.537433    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:50.036639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:50.036659    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:50.036665    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:50.036670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:50.038735    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:50.038803    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:50.535659    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:50.535678    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:50.535690    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:50.535697    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:50.538598    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:51.036768    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:51.036782    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:51.036789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:51.036794    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:51.038898    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:51.536592    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:51.536608    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:51.536616    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:51.536621    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:51.539087    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:52.036618    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:52.036639    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:52.036652    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:52.036658    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:52.039828    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:52.039911    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:52.535902    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:52.535912    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:52.535919    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:52.535922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:52.537950    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:53.036636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:53.036705    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:53.036716    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:53.036721    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:53.039002    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:53.535455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:53.535467    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:53.535473    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:53.535476    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:53.537615    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:54.036291    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:54.036325    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:54.036406    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:54.036414    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:54.039211    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:54.535751    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:54.535763    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:54.535769    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:54.535772    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:54.537488    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:54.537606    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:55.036966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:55.036982    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:55.036988    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:55.036992    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:55.038791    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:55.537260    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:55.537303    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:55.537312    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:55.537315    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:55.539579    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:56.036346    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:56.036359    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:56.036367    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:56.036370    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:56.038527    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:56.536015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:56.536055    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:56.536063    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:56.536068    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:56.538048    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:56.538106    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:57.036625    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:57.036637    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:57.036643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:57.036646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:57.038481    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:57.536731    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:57.536744    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:57.536749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:57.536752    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:57.538619    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:58.037081    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:58.037160    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:58.037174    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:58.037182    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:58.040222    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:58.535441    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:58.535453    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:58.535460    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:58.535463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:58.537373    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:59.037130    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:59.037151    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:59.037161    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:59.037181    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:59.039237    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:59.039342    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:59.536756    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:59.536768    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:59.536774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:59.536777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:59.538430    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:00.036701    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:00.036714    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:00.036720    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:00.036723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:00.038842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:00.535558    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:00.535574    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:00.535620    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:00.535625    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:00.537993    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.036274    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:01.036293    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:01.036302    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:01.036305    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:01.038700    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.536455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:01.536488    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:01.536495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:01.536511    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:01.538672    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.538736    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:02.036272    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:02.036286    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:02.036291    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:02.036295    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:02.038419    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:02.535392    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:02.535405    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:02.535416    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:02.535419    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:02.537336    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:03.036249    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:03.036264    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:03.036271    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:03.036276    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:03.038181    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:03.536990    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:03.537012    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:03.537020    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:03.537024    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:03.541054    3827 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0731 10:09:03.541125    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:04.036809    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:04.036887    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:04.036896    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:04.036902    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:04.039202    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:04.537089    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:04.537152    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:04.537166    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:04.537904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:04.540615    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:05.036817    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:05.036832    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:05.036838    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:05.036842    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:05.038865    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:05.535412    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:05.535430    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:05.535438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:05.535446    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:05.538103    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:06.036140    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:06.036160    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:06.036172    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:06.036179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:06.039025    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:06.039098    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:06.536908    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:06.536923    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:06.536930    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:06.536933    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:06.538854    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:07.035951    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:07.035965    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:07.035974    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:07.035979    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:07.038105    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:07.535618    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:07.535629    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:07.535635    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:07.535637    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:07.537552    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:08.036184    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:08.036212    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:08.036273    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:08.036279    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:08.038850    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:08.536040    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:08.536056    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:08.536065    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:08.536069    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:08.538402    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:08.538460    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:09.036971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:09.037018    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:09.037025    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:09.037031    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:09.039100    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:09.535468    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:09.535480    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:09.535487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:09.535490    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:09.537589    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.035464    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:10.035479    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:10.035491    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:10.035506    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:10.037831    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.536550    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:10.536622    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:10.536632    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:10.536638    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:10.539005    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.539064    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:11.037316    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:11.037399    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:11.037415    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:11.037425    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:11.040113    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:11.536965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:11.536989    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:11.537033    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:11.537044    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:11.539689    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:12.036399    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:12.036469    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:12.036480    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:12.036486    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:12.038399    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:12.535441    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:12.535463    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:12.535475    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:12.535486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:12.539207    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:12.539333    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:13.036110    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:13.036220    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:13.036231    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:13.036236    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:13.038510    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:13.535970    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:13.535990    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:13.536002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:13.536008    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:13.539197    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:14.037193    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:14.037263    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:14.037274    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:14.037286    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:14.039603    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:14.535571    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:14.535586    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:14.535591    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:14.535594    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:14.537915    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:15.036611    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:15.036630    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:15.036642    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:15.036648    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:15.039592    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:15.039739    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:15.535565    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:15.535590    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:15.535602    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:15.535608    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:15.539127    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:16.035884    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:16.035904    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:16.035915    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:16.035919    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:16.038938    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:16.535882    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:16.535893    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:16.535900    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:16.535904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:16.537836    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:17.036590    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:17.036605    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:17.036613    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:17.036618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:17.039082    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:17.535436    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:17.535454    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:17.535466    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:17.535472    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:17.539228    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:17.539295    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:18.035478    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:18.035491    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:18.035505    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:18.035509    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:18.037946    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:18.536869    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:18.536884    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:18.536890    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:18.536896    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:18.538941    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:19.035847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:19.035859    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:19.035865    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:19.035868    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:19.037761    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:19.536117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:19.536142    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:19.536154    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:19.536160    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:19.539347    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:19.539466    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:20.036919    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:20.036993    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:20.037004    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:20.037009    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:20.039230    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:20.536619    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:20.536716    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:20.536731    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:20.536738    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:20.539591    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:21.036024    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:21.036114    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:21.036129    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:21.036136    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:21.038666    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:21.535434    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:21.535447    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:21.535453    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:21.535457    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:21.537251    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:22.037204    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:22.037219    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:22.037228    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:22.037234    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:22.039524    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:22.039581    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:22.536431    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:22.536450    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:22.536464    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:22.536473    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:22.539233    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:23.035562    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:23.035606    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:23.035627    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:23.035634    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:23.037971    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:23.536650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:23.536675    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:23.536742    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:23.536752    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:23.539879    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:24.035514    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:24.035529    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:24.035535    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:24.035544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:24.037431    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:24.536058    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:24.536156    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:24.536171    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:24.536179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:24.538730    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:24.538810    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:25.036752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:25.036804    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:25.036814    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:25.036821    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:25.039117    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:25.535569    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:25.535587    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:25.535596    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:25.535600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:25.538114    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:26.035517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:26.035542    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:26.035556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:26.035562    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:26.038485    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:26.536365    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:26.536379    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:26.536386    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:26.536390    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:26.538690    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:27.036639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:27.036652    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:27.036703    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:27.036709    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:27.038432    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:27.038498    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:27.535539    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:27.535560    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:27.535572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:27.535580    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:27.538434    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:28.035626    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:28.035638    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:28.035644    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:28.035647    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:28.037699    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:28.536177    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:28.536199    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:28.536212    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:28.536217    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:28.539218    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:29.036925    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:29.036950    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:29.036962    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:29.036969    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:29.040007    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:29.040064    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:29.537194    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:29.537209    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:29.537228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:29.537240    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:29.539598    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:30.036373    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:30.036471    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:30.036486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:30.036494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:30.039302    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:30.536789    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:30.536807    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:30.536815    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:30.536820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:30.539885    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:31.036599    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:31.036624    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:31.036635    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:31.036643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:31.039815    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:31.536237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:31.536285    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:31.536295    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:31.536301    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:31.538680    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:31.538744    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:32.036451    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:32.036463    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:32.036469    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:32.036472    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:32.038847    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:32.536969    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:32.537019    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:32.537032    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:32.537041    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:32.539636    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:33.035557    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:33.035573    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:33.035582    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:33.035587    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:33.038048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:33.535485    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:33.535509    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:33.535522    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:33.535529    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:33.538268    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:34.035811    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:34.035830    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:34.035841    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:34.035846    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:34.038580    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:34.038645    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:34.535515    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:34.535533    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:34.535543    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:34.535562    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:34.537523    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:35.036865    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:35.036880    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:35.036887    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:35.036890    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:35.038894    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:35.535476    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:35.535566    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:35.535574    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:35.535579    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:35.537495    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:36.036205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:36.036221    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:36.036227    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:36.036231    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:36.038994    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:36.039061    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:36.536105    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:36.536117    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:36.536124    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:36.536127    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:36.538020    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:37.036112    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:37.036124    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:37.036130    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:37.036134    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:37.037953    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:37.536082    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:37.536101    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:37.536110    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:37.536114    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:37.538459    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:38.035493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:38.035509    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:38.035517    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:38.035524    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:38.037791    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:38.535613    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:38.535632    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:38.535645    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:38.535668    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:38.539185    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:38.539281    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:39.036660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:39.036682    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:39.036693    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:39.036700    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:39.039452    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:39.535986    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:39.536000    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:39.536007    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:39.536011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:39.537968    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:40.036939    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:40.037010    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:40.037021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:40.037026    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:40.039435    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:40.536149    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:40.536171    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:40.536233    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:40.536239    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:40.538338    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:41.036629    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:41.036641    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:41.036647    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:41.036651    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:41.038835    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:41.038897    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:41.536269    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:41.536280    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:41.536287    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:41.536290    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:41.538277    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:42.036495    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:42.036511    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:42.036520    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:42.036524    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:42.038560    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:42.537182    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:42.537201    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:42.537210    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:42.537215    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:42.539833    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.035857    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:43.035874    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:43.035881    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:43.035891    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:43.038530    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.536377    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:43.536465    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:43.536480    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:43.536488    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:43.539159    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.539217    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:44.036979    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:44.037065    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:44.037081    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:44.037089    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:44.039312    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:44.536993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:44.537011    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:44.537018    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:44.537063    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:44.539131    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.036929    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:45.036952    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:45.037050    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:45.037064    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:45.039700    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.537089    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:45.537112    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:45.537123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:45.537132    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:45.539940    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.540011    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:46.036811    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:46.036857    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:46.036868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:46.036882    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:46.039540    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:46.535831    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:46.535845    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:46.535852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:46.535856    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:46.538387    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:47.036117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:47.036128    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:47.036134    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:47.036137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:47.037871    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:47.536504    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:47.536553    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:47.536564    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:47.536568    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:47.538867    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:48.036960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:48.036980    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:48.036992    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:48.036998    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:48.040512    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:48.041066    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:48.535514    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:48.535532    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:48.535542    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:48.535547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:48.537881    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:49.036112    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:49.036124    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:49.036130    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:49.036133    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:49.038899    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:49.536876    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:49.536893    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:49.536899    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:49.536904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:49.538675    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:50.037190    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:50.037204    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:50.037213    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:50.037216    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:50.039015    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:50.536824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:50.536920    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:50.536935    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:50.536942    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:50.539735    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:50.539808    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:51.035683    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:51.035696    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:51.035702    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:51.035706    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:51.038883    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:51.536861    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:51.536882    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:51.536894    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:51.536901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:51.539779    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:52.035474    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:52.035485    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:52.035493    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:52.035499    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:52.037401    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:52.536642    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:52.536661    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:52.536669    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:52.536674    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:52.538949    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:53.036427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:53.036471    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:53.036482    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:53.036487    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:53.038951    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:53.039010    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:53.535427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:53.535439    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:53.535446    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:53.535450    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:53.537257    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:54.036806    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:54.036821    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:54.036828    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:54.036832    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:54.039021    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:54.535805    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:54.535897    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:54.535912    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:54.535919    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:54.538990    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:55.036521    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:55.036539    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:55.036546    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:55.036549    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:55.038766    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:55.536647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:55.536714    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:55.536723    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:55.536727    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:55.539055    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:55.539163    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:56.035522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:56.035534    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:56.035541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:56.035545    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:56.038160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:56.535916    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:56.535934    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:56.535943    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:56.535949    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:56.538329    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:57.036391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:57.036406    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:57.036413    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:57.036417    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:57.038267    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:57.535390    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:57.535439    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:57.535447    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:57.535452    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:57.537243    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:58.036752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:58.036778    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:58.036805    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:58.036809    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:58.038620    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:58.038682    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:58.536471    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:58.536516    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:58.536526    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:58.536532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:58.538643    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:59.035837    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:59.035851    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:59.035858    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:59.035861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:59.037705    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:59.536730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:59.536832    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:59.536848    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:59.536854    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:59.539682    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:00.035558    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:00.035587    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:00.035600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:00.035612    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:00.037523    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:00.535512    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:00.535528    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:00.535534    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:00.535537    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:00.537603    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:00.537667    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:01.036888    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:01.036943    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:01.036951    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:01.036955    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:01.038774    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:01.535488    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:01.535504    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:01.535513    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:01.535517    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:01.538017    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:02.036031    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:02.036045    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:02.036051    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:02.036054    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:02.037488    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:02.537218    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:02.537285    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:02.537295    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:02.537300    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:02.539559    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:02.539701    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:03.036241    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:03.036256    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:03.036263    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:03.036269    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:03.037763    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:03.536877    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:03.536892    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:03.536901    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:03.536904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:03.539168    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:04.035721    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:04.035733    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:04.035739    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:04.035742    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:04.037607    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:04.535679    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:04.535694    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:04.535703    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:04.535707    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:04.537920    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:05.037180    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:05.037195    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:05.037201    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:05.037205    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:05.038872    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:05.038947    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:05.536233    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:05.536248    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:05.536254    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:05.536258    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:05.538191    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:06.036830    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:06.036845    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:06.036852    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:06.036856    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:06.038427    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:06.536722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:06.536735    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:06.536741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:06.536753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:06.538631    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:07.036171    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:07.036186    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:07.036192    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:07.036195    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:07.038330    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:07.536466    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:07.536481    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:07.536488    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:07.536492    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:07.538446    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:07.538510    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:08.036787    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:08.036821    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:08.036832    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:08.036853    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:08.039084    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:08.535567    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:08.535582    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:08.535589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:08.535593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:08.537711    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.035421    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:09.035432    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:09.035438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:09.035442    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:09.037921    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.535887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:09.535904    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:09.535913    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:09.535943    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:09.538516    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.538592    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:10.035458    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:10.035469    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:10.035474    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:10.035477    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:10.038652    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:10.535979    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:10.535992    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:10.535998    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:10.536002    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:10.537981    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:11.035819    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:11.035886    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:11.035897    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:11.035901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:11.038043    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:11.535475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:11.535487    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:11.535494    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:11.535497    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:11.537395    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:12.036578    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:12.036591    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:12.036598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:12.036601    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:12.038621    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:12.038676    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:12.536927    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:12.536941    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:12.536947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:12.536952    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:12.539050    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:13.036386    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:13.036399    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:13.036428    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:13.036433    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:13.038022    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:13.536356    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:13.536376    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:13.536403    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:13.536406    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:13.538305    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:14.035960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:14.035973    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:14.035979    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:14.035983    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:14.037566    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:14.535889    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:14.535909    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:14.535920    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:14.535926    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:14.538796    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:14.538873    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:15.037263    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:15.037278    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:15.037284    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:15.037291    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:15.038934    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:15.535930    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:15.535949    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:15.535957    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:15.535961    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:15.538412    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:16.035774    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:16.035790    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:16.035798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:16.035803    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:16.037617    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:16.536338    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:16.536352    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:16.536359    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:16.536362    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:16.538545    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:17.036602    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:17.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:17.036625    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:17.036630    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:17.039042    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:17.039098    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:17.535886    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:17.535901    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:17.535907    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:17.535910    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:17.538060    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:18.036894    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:18.036938    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:18.036947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:18.036950    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:18.038702    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:18.535556    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:18.535571    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:18.535580    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:18.535586    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:18.537620    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:19.035993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:19.036009    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:19.036017    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:19.036021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:19.038160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:19.536410    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:19.536433    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:19.536444    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:19.536452    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:19.539613    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:19.539694    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:20.035430    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:20.035445    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:20.035456    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:20.035466    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:20.037008    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:20.536812    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:20.536836    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:20.536849    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:20.536855    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:20.539846    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:21.035730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:21.035746    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:21.035755    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:21.035761    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:21.037893    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:21.536119    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:21.536158    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:21.536173    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:21.536181    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:21.538305    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:22.035742    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:22.035779    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:22.035790    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:22.035796    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:22.038072    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:22.038175    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:22.536977    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:22.536992    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:22.536999    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:22.537002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:22.539319    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:23.036522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:23.036538    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:23.036544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:23.036547    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:23.038326    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:23.537176    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:23.537194    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:23.537202    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:23.537208    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:23.539537    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:24.036672    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:24.036686    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:24.036692    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:24.036696    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:24.038290    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:24.038347    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:24.536490    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:24.536508    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:24.536519    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:24.536525    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:24.539462    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:25.036309    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:25.036323    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:25.036329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:25.036332    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:25.038173    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:25.535523    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:25.535539    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:25.535547    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:25.535552    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:25.538454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:26.035663    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:26.035681    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:26.035719    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:26.035722    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:26.037593    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:26.536821    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:26.536884    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:26.536893    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:26.536896    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:26.538841    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:26.538912    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:27.036722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:27.036734    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:27.036740    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:27.036743    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:27.038648    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:27.537059    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:27.537079    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:27.537111    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:27.537116    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:27.539595    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:28.035398    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:28.035411    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:28.035417    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:28.035421    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:28.037116    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:28.536047    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:28.536115    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:28.536125    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:28.536133    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:28.538589    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:29.036033    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:29.036048    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:29.036055    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:29.036058    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:29.038794    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:29.038860    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:29.536173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:29.536187    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:29.536193    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:29.536198    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:29.538161    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:30.036950    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:30.037050    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:30.037065    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:30.037072    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:30.039996    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:30.536407    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:30.536424    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:30.536485    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:30.536492    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:30.538637    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:31.036484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:31.036581    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:31.036593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:31.036600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:31.039439    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:31.039521    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:31.535848    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:31.535863    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:31.535872    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:31.535878    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:31.538048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:32.036070    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:32.036083    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:32.036092    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:32.036097    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:32.038358    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:32.535559    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:32.535583    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:32.535597    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:32.535604    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:32.538962    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:33.035868    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:33.035880    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:33.035887    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:33.035890    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:33.038234    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:33.536345    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:33.536363    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:33.536408    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:33.536413    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:33.538408    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:33.538470    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:34.035876    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:34.035899    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:34.035911    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:34.035917    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:34.038813    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:34.535532    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:34.535555    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:34.535599    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:34.535611    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:34.538619    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.036525    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:35.036545    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:35.036557    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:35.036565    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:35.039453    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.536317    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:35.536338    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:35.536346    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:35.536351    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:35.538546    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.538604    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:36.035614    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:36.035632    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:36.035642    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:36.035648    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:36.037951    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:36.535593    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:36.535610    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:36.535620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:36.535627    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:36.538091    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:37.035952    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:37.035972    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:37.035984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:37.035992    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:37.039078    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:37.536397    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:37.536416    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:37.536425    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:37.536431    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:37.538652    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:37.538721    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:38.036647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:38.036688    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:38.036697    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:38.036702    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:38.038657    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:38.535391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:38.535458    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:38.535469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:38.535474    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:38.537747    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:39.036877    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:39.036896    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:39.036908    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:39.036916    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:39.039937    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:39.537361    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:39.537463    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:39.537475    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:39.537480    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:39.540492    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:39.540575    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:40.035736    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:40.035759    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:40.035797    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:40.035817    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:40.038896    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:40.536124    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:40.536136    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:40.536142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:40.536147    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:40.538082    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:41.036456    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:41.036502    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:41.036513    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:41.036519    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:41.038631    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:41.535516    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:41.535529    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:41.535535    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:41.535539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:41.537637    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:42.035758    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:42.035779    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:42.035790    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:42.035795    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:42.038565    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:42.038648    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:42.536775    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:42.536801    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:42.536856    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:42.536867    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:42.539883    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:43.036733    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:43.036747    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:43.036754    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:43.036758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:43.038792    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:43.536704    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:43.536719    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:43.536725    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:43.536730    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:43.538830    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:44.037317    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:44.037342    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:44.037351    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:44.037356    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:44.040355    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:44.040430    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:44.537337    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:44.537352    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:44.537358    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:44.537362    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:44.539426    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:45.036153    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:45.036174    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:45.036187    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:45.036193    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:45.039178    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:45.535572    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:45.535584    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:45.535591    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:45.535596    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:45.537420    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:46.037146    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:46.037161    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:46.037168    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:46.037199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:46.039539    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:46.536761    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:46.536842    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:46.536857    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:46.536863    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:46.539600    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:46.539683    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:47.037209    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:47.037228    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:47.037237    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:47.037243    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:47.039381    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:47.536097    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:47.536127    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:47.536138    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:47.536143    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:47.540045    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:48.035580    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:48.035598    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:48.035610    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:48.035618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:48.037609    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:48.535945    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:48.535960    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:48.535966    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:48.535969    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:48.537852    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:49.036904    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:49.036928    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:49.036941    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:49.036946    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:49.039794    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:49.039868    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:49.536635    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:49.536649    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:49.536699    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:49.536704    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:49.538637    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:50.035478    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:50.035491    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:50.035497    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:50.035500    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:50.037398    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:50.536222    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:50.536321    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:50.536335    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:50.536342    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:50.539228    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.035730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:51.035748    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:51.035813    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:51.035820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:51.037953    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.536457    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:51.536471    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:51.536480    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:51.536485    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:51.538865    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.538935    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:52.036481    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:52.036503    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:52.036583    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:52.036593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:52.039545    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:52.536583    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:52.536620    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:52.536636    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:52.536646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:52.539115    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:53.037214    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:53.037226    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:53.037256    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:53.037262    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:53.039257    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:53.535880    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:53.535892    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:53.535898    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:53.535901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:53.538097    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:54.035680    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:54.035691    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:54.035697    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:54.035702    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:54.037758    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:54.037819    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:54.536181    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:54.536195    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:54.536250    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:54.536256    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:54.538069    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:55.036750    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:55.036858    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:55.036874    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:55.036881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:55.040140    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:55.535731    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:55.535746    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:55.535752    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:55.535755    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:55.537710    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:56.037367    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:56.037382    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:56.037392    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:56.037396    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:56.039716    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:56.039828    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:56.535738    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:56.535750    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:56.535757    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:56.535760    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:56.537553    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:57.036797    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:57.036852    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:57.036859    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:57.036862    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:57.038921    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:57.535419    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:57.535437    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:57.535446    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:57.535452    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:57.537842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.035459    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:58.035475    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:58.035484    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:58.035488    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:58.037963    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.536607    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:58.536625    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:58.536640    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:58.536653    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:58.539173    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.539233    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:59.035868    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:59.035890    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:59.035902    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:59.035912    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:59.038872    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:59.535411    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:59.535426    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:59.535432    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:59.535434    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:59.537913    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:00.036663    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:00.036679    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:00.036686    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:00.036690    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:00.038915    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:00.536586    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:00.536602    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:00.536610    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:00.536615    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:00.538823    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:01.037017    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:01.037041    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:01.037053    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:01.037058    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:01.039885    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:01.039956    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:01.537010    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:01.537022    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:01.537028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:01.537032    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:01.538870    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:02.036801    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:02.036819    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:02.036827    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:02.036831    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:02.039277    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:02.535479    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:02.535495    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:02.535501    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:02.535505    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:02.537168    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:03.037023    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:03.037069    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:03.037079    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:03.037084    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:03.039521    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:03.536060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:03.536073    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:03.536079    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:03.536083    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:03.538949    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:03.539021    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:04.036364    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:04.036379    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:04.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:04.036390    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:04.038419    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:04.536237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:04.536251    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:04.536260    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:04.536264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:04.538409    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:05.035688    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:05.035701    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:05.035708    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:05.035712    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:05.037474    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:05.535639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:05.535661    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:05.535671    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:05.535676    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:05.538235    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:06.036540    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:06.036554    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:06.036560    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:06.036564    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:06.039139    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:06.039201    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:06.536852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:06.536867    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:06.536875    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:06.536879    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:06.539160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:07.037400    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:07.037412    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:07.037419    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:07.037422    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:07.039316    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:07.535475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:07.535496    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:07.535507    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:07.535514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:07.538665    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:11:08.035588    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:08.035602    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:08.035609    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:08.035614    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:08.037450    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:08.535606    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:08.535617    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:08.535624    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:08.535628    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:08.537643    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:08.537700    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:09.036533    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:09.036549    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:09.036556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:09.036560    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:09.038511    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:09.536726    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:09.536794    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:09.536805    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:09.536810    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:09.539347    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.036599    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:10.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:10.036626    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:10.036630    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:10.038891    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.535919    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:10.535991    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:10.536003    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:10.536009    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:10.538198    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.538256    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:11.035775    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:11.035789    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:11.035795    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:11.035799    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:11.037602    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:11.535963    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:11.535977    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:11.535984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:11.535988    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:11.538020    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:12.035422    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:12.035494    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:12.035509    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:12.035514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:12.037902    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:12.536484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:12.536500    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:12.536506    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:12.536510    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:12.538333    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:12.538392    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:12.538407    3827 node_ready.go:38] duration metric: took 4m0.003142979s for node "ha-393000-m04" to be "Ready" ...
	I0731 10:11:12.560167    3827 out.go:177] 
	W0731 10:11:12.580908    3827 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0731 10:11:12.580926    3827 out.go:239] * 
	W0731 10:11:12.582125    3827 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:11:12.680641    3827 out.go:177] 

                                                
                                                
** /stderr **
ha_test.go:562: failed to start cluster. args "out/minikube-darwin-amd64 start -p ha-393000 --wait=true -v=7 --alsologtostderr --driver=hyperkit " : exit status 80
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:244: <<< TestMultiControlPlane/serial/RestartCluster FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/RestartCluster]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (3.491348964s)
helpers_test.go:252: TestMultiControlPlane/serial/RestartCluster logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node stop m02 -v=7         | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:58 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node start m02 -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:59 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000 -v=7               | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-393000 -v=7                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT | 31 Jul 24 10:00 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	| node    | ha-393000 node delete m03 -v=7       | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | ha-393000 stop -v=7                  | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT | 31 Jul 24 10:05 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true             | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:05 PDT |                     |
	|         | -v=7 --alsologtostderr               |           |         |         |                     |                     |
	|         | --driver=hyperkit                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
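The DNS checks in the table above pipe `nslookup` output through `awk 'NR==5' | cut -d' ' -f3` inside the busybox pods to pull out the resolved address. A minimal sketch of that extraction, run against illustrative output (the sample below is hand-written, not captured from this test run):

```shell
# Hedged sketch: extract the resolved IP the same way the table's kubectl
# exec commands do. The lookup_output text is an assumed, typical busybox
# nslookup layout where line 5 holds "Address 1: <ip>".
lookup_output='Server:    10.96.0.10
Address:   10.96.0.10:53

Name:      host.minikube.internal
Address 1: 192.169.0.1'

# NR==5 keeps only the fifth line; field 3 (space-delimited) is the IP.
ip=$(printf '%s\n' "$lookup_output" | awk 'NR==5' | cut -d' ' -f3)
echo "$ip"
```

This layout-sensitive parse is why the test pins a specific line number: a different resolver banner would shift the address onto another line and break the pipeline.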
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 10:05:02
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 10:05:02.368405    3827 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:05:02.368654    3827 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.368660    3827 out.go:304] Setting ErrFile to fd 2...
	I0731 10:05:02.368664    3827 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.368853    3827 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:05:02.370244    3827 out.go:298] Setting JSON to false
	I0731 10:05:02.392379    3827 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2072,"bootTime":1722443430,"procs":453,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:05:02.392490    3827 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:05:02.414739    3827 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 10:05:02.457388    3827 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:05:02.457417    3827 notify.go:220] Checking for updates...
	I0731 10:05:02.499271    3827 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:02.520330    3827 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:05:02.541352    3827 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:05:02.562183    3827 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:05:02.583467    3827 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:05:02.605150    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:02.605829    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.605892    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.615374    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51985
	I0731 10:05:02.615746    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.616162    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.616171    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.616434    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.616563    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.616815    3827 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:05:02.617053    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.617075    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.625506    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51987
	I0731 10:05:02.625873    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.626205    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.626218    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.626409    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.626526    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.655330    3827 out.go:177] * Using the hyperkit driver based on existing profile
	I0731 10:05:02.697472    3827 start.go:297] selected driver: hyperkit
	I0731 10:05:02.697517    3827 start.go:901] validating driver "hyperkit" against &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclas
s:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersio
n:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:02.697705    3827 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:05:02.697830    3827 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:05:02.698011    3827 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:05:02.707355    3827 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:05:02.711327    3827 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.711347    3827 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:05:02.714056    3827 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:05:02.714115    3827 cni.go:84] Creating CNI manager for ""
	I0731 10:05:02.714124    3827 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:05:02.714208    3827 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.16
9.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-t
iller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0
MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:02.714310    3827 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:05:02.756588    3827 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 10:05:02.778505    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:02.778576    3827 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:05:02.778606    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:05:02.778797    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:05:02.778816    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:05:02.779007    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:02.779936    3827 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:05:02.780056    3827 start.go:364] duration metric: took 96.562µs to acquireMachinesLock for "ha-393000"
	I0731 10:05:02.780090    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:05:02.780107    3827 fix.go:54] fixHost starting: 
	I0731 10:05:02.780518    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.780547    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.789537    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51989
	I0731 10:05:02.789941    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.790346    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.790360    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.790582    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.790683    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.790784    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:02.790882    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.790960    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:05:02.791917    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 3685 missing from process table
	I0731 10:05:02.791950    3827 fix.go:112] recreateIfNeeded on ha-393000: state=Stopped err=<nil>
	I0731 10:05:02.791969    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	W0731 10:05:02.792054    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:05:02.834448    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000" ...
	I0731 10:05:02.857592    3827 main.go:141] libmachine: (ha-393000) Calling .Start
	I0731 10:05:02.857865    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.857903    3827 main.go:141] libmachine: (ha-393000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 10:05:02.857999    3827 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 10:05:02.972788    3827 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 10:05:02.972822    3827 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:05:02.973002    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002e0840)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:02.973031    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002e0840)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:02.973095    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=s
erial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:05:02.973143    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset
norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:05:02.973162    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:05:02.974700    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Pid is 3840
	I0731 10:05:02.975089    3827 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 10:05:02.975104    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.975174    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:05:02.977183    3827 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 10:05:02.977235    3827 main.go:141] libmachine: (ha-393000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:05:02.977252    3827 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66aa6ebd}
	I0731 10:05:02.977264    3827 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 10:05:02.977271    3827 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 10:05:02.977358    3827 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 10:05:02.978043    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:02.978221    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:02.978639    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:05:02.978649    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.978783    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:02.978867    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:02.978959    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:02.979081    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:02.979169    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:02.979279    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:02.979484    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:02.979495    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:05:02.982358    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:05:03.035630    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:05:03.036351    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:03.036364    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:03.036371    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:03.036377    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:03.417037    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:05:03.417051    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:05:03.531673    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:03.531715    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:03.531732    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:03.531747    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:03.532606    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:05:03.532629    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:05:09.110387    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:05:09.110442    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:05:09.110451    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:05:09.135557    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:05:12.964386    3827 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.5:22: connect: connection refused
	I0731 10:05:16.034604    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:05:16.034620    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.034750    3827 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 10:05:16.034759    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.034882    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.034984    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.035084    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.035183    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.035281    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.035421    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.035570    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.035579    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 10:05:16.113215    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 10:05:16.113236    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.113381    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.113518    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.113636    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.113755    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.113885    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.114075    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.114086    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:05:16.184090    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:05:16.184121    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:05:16.184150    3827 buildroot.go:174] setting up certificates
	I0731 10:05:16.184163    3827 provision.go:84] configureAuth start
	I0731 10:05:16.184170    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.184309    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:16.184430    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.184520    3827 provision.go:143] copyHostCerts
	I0731 10:05:16.184558    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:16.184631    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:05:16.184638    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:16.184770    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:05:16.184969    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:16.185016    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:05:16.185020    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:16.185099    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:05:16.185248    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:16.185290    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:05:16.185295    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:16.185376    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:05:16.185533    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 10:05:16.315363    3827 provision.go:177] copyRemoteCerts
	I0731 10:05:16.315421    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:05:16.315435    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.315558    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.315655    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.315746    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.315837    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:16.355172    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:05:16.355248    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:05:16.374013    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:05:16.374082    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0731 10:05:16.392556    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:05:16.392614    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:05:16.411702    3827 provision.go:87] duration metric: took 227.524882ms to configureAuth
	I0731 10:05:16.411715    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:05:16.411879    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:16.411893    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:16.412059    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.412155    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.412231    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.412316    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.412388    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.412496    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.412621    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.412628    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:05:16.477022    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:05:16.477033    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:05:16.477102    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:05:16.477118    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.477251    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.477356    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.477432    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.477517    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.477641    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.477778    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.477823    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:05:16.554633    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:05:16.554652    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.554788    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.554883    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.554976    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.555060    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.555183    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.555333    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.555346    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:05:18.220571    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:05:18.220585    3827 machine.go:97] duration metric: took 15.241941013s to provisionDockerMachine
	I0731 10:05:18.220598    3827 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 10:05:18.220606    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:05:18.220616    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.220842    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:05:18.220863    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.220962    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.221049    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.221130    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.221229    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.266644    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:05:18.270380    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:05:18.270395    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:05:18.270494    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:05:18.270687    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:05:18.270693    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:05:18.270912    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:05:18.279363    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:18.313374    3827 start.go:296] duration metric: took 92.765768ms for postStartSetup
	I0731 10:05:18.313403    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.313592    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:05:18.313611    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.313704    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.313791    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.313881    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.313968    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.352727    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:05:18.352783    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:05:18.406781    3827 fix.go:56] duration metric: took 15.626681307s for fixHost
	I0731 10:05:18.406809    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.406951    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.407051    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.407152    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.407242    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.407364    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:18.407503    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:18.407510    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:05:18.475125    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445518.591979627
	
	I0731 10:05:18.475138    3827 fix.go:216] guest clock: 1722445518.591979627
	I0731 10:05:18.475144    3827 fix.go:229] Guest: 2024-07-31 10:05:18.591979627 -0700 PDT Remote: 2024-07-31 10:05:18.406799 -0700 PDT m=+16.073052664 (delta=185.180627ms)
	I0731 10:05:18.475163    3827 fix.go:200] guest clock delta is within tolerance: 185.180627ms
	I0731 10:05:18.475167    3827 start.go:83] releasing machines lock for "ha-393000", held for 15.69510158s
	I0731 10:05:18.475186    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.475358    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:18.475493    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.475894    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.476002    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.476070    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:05:18.476101    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.476134    3827 ssh_runner.go:195] Run: cat /version.json
	I0731 10:05:18.476146    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.476186    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.476210    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.476297    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.476335    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.476385    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.476425    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.476484    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.476507    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.560719    3827 ssh_runner.go:195] Run: systemctl --version
	I0731 10:05:18.565831    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 10:05:18.570081    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:05:18.570125    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:05:18.582480    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:05:18.582493    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:18.582597    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:18.598651    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:05:18.607729    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:05:18.616451    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:05:18.616493    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:05:18.625351    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:18.634238    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:05:18.643004    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:18.651930    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:05:18.660791    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:05:18.669545    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:05:18.678319    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:05:18.687162    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:05:18.695297    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:05:18.703279    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:18.796523    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:05:18.814363    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:18.814439    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:05:18.827366    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:18.839312    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:05:18.855005    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:18.866218    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:18.877621    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:05:18.902460    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:18.913828    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:18.928675    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:05:18.931574    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:05:18.939501    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:05:18.952896    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:05:19.047239    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:05:19.144409    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:05:19.144484    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:05:19.159518    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:19.256187    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:05:21.607075    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.350869373s)
	I0731 10:05:21.607140    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:05:21.618076    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:05:21.632059    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:21.642878    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:05:21.739846    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:05:21.840486    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:21.956403    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:05:21.971397    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:21.982152    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:22.074600    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:05:22.139737    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:05:22.139811    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:05:22.144307    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:05:22.144354    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:05:22.147388    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:05:22.177098    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:05:22.177167    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:22.195025    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:22.255648    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:05:22.255698    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:22.256066    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:05:22.260342    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:22.270020    3827 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 10:05:22.270145    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:22.270198    3827 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:05:22.283427    3827 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:05:22.283451    3827 docker.go:615] Images already preloaded, skipping extraction
	I0731 10:05:22.283523    3827 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:05:22.296364    3827 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:05:22.296384    3827 cache_images.go:84] Images are preloaded, skipping loading
	I0731 10:05:22.296395    3827 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 10:05:22.296485    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:05:22.296554    3827 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 10:05:22.333611    3827 cni.go:84] Creating CNI manager for ""
	I0731 10:05:22.333625    3827 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:05:22.333642    3827 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 10:05:22.333657    3827 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 10:05:22.333735    3827 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 10:05:22.333754    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:05:22.333805    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:05:22.346453    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:05:22.346520    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:05:22.346575    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:05:22.354547    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:05:22.354585    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 10:05:22.361938    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 10:05:22.375252    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:05:22.388755    3827 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 10:05:22.402335    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:05:22.415747    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:05:22.418701    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:22.428772    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:22.517473    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:22.532209    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 10:05:22.532222    3827 certs.go:194] generating shared ca certs ...
	I0731 10:05:22.532233    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:22.532416    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:05:22.532495    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:05:22.532505    3827 certs.go:256] generating profile certs ...
	I0731 10:05:22.532617    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:05:22.532703    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e
	I0731 10:05:22.532784    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:05:22.532791    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:05:22.532813    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:05:22.532832    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:05:22.532850    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:05:22.532866    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:05:22.532896    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:05:22.532925    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:05:22.532949    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:05:22.533054    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:05:22.533101    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:05:22.533110    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:05:22.533142    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:05:22.533177    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:05:22.533206    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:05:22.533274    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:22.533306    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.533327    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.533344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.533765    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:05:22.562933    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:05:22.585645    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:05:22.608214    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:05:22.634417    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:05:22.664309    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:05:22.693214    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:05:22.749172    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:05:22.798119    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:05:22.837848    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:05:22.862351    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:05:22.887141    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 10:05:22.900789    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:05:22.904988    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:05:22.914154    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.917542    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.917577    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.921712    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:05:22.930986    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:05:22.940208    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.943536    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.943573    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.947845    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:05:22.957024    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:05:22.965988    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.969319    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.969351    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.973794    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:05:22.982944    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:05:22.986290    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:05:22.990544    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:05:22.994707    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:05:22.999035    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:05:23.003364    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:05:23.007486    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:05:23.011657    3827 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 C
lusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false fres
hpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:do
cker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:23.011769    3827 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 10:05:23.024287    3827 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 10:05:23.032627    3827 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0731 10:05:23.032639    3827 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0731 10:05:23.032681    3827 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0731 10:05:23.040731    3827 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:05:23.041056    3827 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-393000" does not appear in /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.041141    3827 kubeconfig.go:62] /Users/jenkins/minikube-integration/19349-1046/kubeconfig needs updating (will repair): [kubeconfig missing "ha-393000" cluster setting kubeconfig missing "ha-393000" context setting]
	I0731 10:05:23.041332    3827 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.041968    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.042168    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)},
UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 10:05:23.042482    3827 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 10:05:23.042638    3827 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0731 10:05:23.050561    3827 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0731 10:05:23.050575    3827 kubeadm.go:597] duration metric: took 17.931942ms to restartPrimaryControlPlane
	I0731 10:05:23.050580    3827 kubeadm.go:394] duration metric: took 38.928464ms to StartCluster
	I0731 10:05:23.050588    3827 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.050661    3827 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.051035    3827 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.051268    3827 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:05:23.051280    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:05:23.051290    3827 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 10:05:23.051393    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:23.095938    3827 out.go:177] * Enabled addons: 
	I0731 10:05:23.116914    3827 addons.go:510] duration metric: took 65.60253ms for enable addons: enabled=[]
	I0731 10:05:23.116954    3827 start.go:246] waiting for cluster config update ...
	I0731 10:05:23.116965    3827 start.go:255] writing updated cluster config ...
	I0731 10:05:23.138605    3827 out.go:177] 
	I0731 10:05:23.160466    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:23.160597    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.182983    3827 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 10:05:23.224869    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:23.224904    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:05:23.225104    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:05:23.225125    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:05:23.225250    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.226256    3827 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:05:23.226360    3827 start.go:364] duration metric: took 80.549µs to acquireMachinesLock for "ha-393000-m02"
	I0731 10:05:23.226385    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:05:23.226394    3827 fix.go:54] fixHost starting: m02
	I0731 10:05:23.226804    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:23.226838    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:23.236394    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52012
	I0731 10:05:23.236756    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:23.237106    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:23.237125    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:23.237342    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:23.237473    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:23.237574    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:05:23.237669    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.237738    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:05:23.238671    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:23.238732    3827 fix.go:112] recreateIfNeeded on ha-393000-m02: state=Stopped err=<nil>
	I0731 10:05:23.238750    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	W0731 10:05:23.238834    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:05:23.260015    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m02" ...
	I0731 10:05:23.302032    3827 main.go:141] libmachine: (ha-393000-m02) Calling .Start
	I0731 10:05:23.302368    3827 main.go:141] libmachine: (ha-393000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 10:05:23.302393    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.304220    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:23.304235    3827 main.go:141] libmachine: (ha-393000-m02) DBG | pid 3703 is in state "Stopped"
	I0731 10:05:23.304257    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid...
	I0731 10:05:23.304590    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 10:05:23.331752    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 10:05:23.331774    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:05:23.331901    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c2fc0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:23.331928    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c2fc0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:23.331992    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:05:23.332030    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:05:23.332051    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:05:23.333566    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Pid is 3849
	I0731 10:05:23.333951    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 10:05:23.333966    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.334032    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3849
	I0731 10:05:23.335680    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 10:05:23.335745    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:05:23.335779    3827 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:05:23.335790    3827 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbf52}
	I0731 10:05:23.335796    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 10:05:23.335803    3827 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 10:05:23.335842    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 10:05:23.336526    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:23.336703    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.337199    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:05:23.337210    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:23.337324    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:23.337431    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:23.337536    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:23.337643    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:23.337761    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:23.337898    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:23.338051    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:23.338058    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:05:23.341501    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:05:23.350236    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:05:23.351301    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:23.351321    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:23.351333    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:23.351364    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:23.736116    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:05:23.736132    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:05:23.851173    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:23.851191    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:23.851204    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:23.851217    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:23.852083    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:05:23.852399    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:05:29.408102    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:05:29.408171    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:05:29.408180    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:05:29.431671    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:05:34.400446    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:05:34.400461    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.400584    3827 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 10:05:34.400595    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.400705    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.400796    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.400890    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.400963    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.401039    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.401181    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.401327    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.401336    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 10:05:34.470038    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 10:05:34.470053    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.470199    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.470327    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.470407    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.470489    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.470615    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.470762    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.470773    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:05:34.535872    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:05:34.535890    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:05:34.535899    3827 buildroot.go:174] setting up certificates
	I0731 10:05:34.535905    3827 provision.go:84] configureAuth start
	I0731 10:05:34.535911    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.536042    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:34.536141    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.536239    3827 provision.go:143] copyHostCerts
	I0731 10:05:34.536274    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:34.536323    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:05:34.536328    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:34.536441    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:05:34.536669    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:34.536701    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:05:34.536706    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:34.536812    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:05:34.536958    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:34.536987    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:05:34.536992    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:34.537061    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:05:34.537222    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 10:05:34.648982    3827 provision.go:177] copyRemoteCerts
	I0731 10:05:34.649040    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:05:34.649057    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.649198    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.649295    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.649402    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.649489    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:34.683701    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:05:34.683772    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:05:34.703525    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:05:34.703596    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:05:34.722548    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:05:34.722624    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:05:34.742309    3827 provision.go:87] duration metric: took 206.391288ms to configureAuth
	I0731 10:05:34.742322    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:05:34.742483    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:34.742496    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:34.742630    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.742723    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.742814    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.742903    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.742982    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.743099    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.743260    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.743269    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:05:34.800092    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:05:34.800106    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:05:34.800191    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:05:34.800203    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.800330    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.800415    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.800506    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.800591    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.800702    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.800838    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.800885    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:05:34.869190    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:05:34.869210    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.869342    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.869439    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.869544    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.869626    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.869780    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.869920    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.869935    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:05:36.520454    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:05:36.520469    3827 machine.go:97] duration metric: took 13.183263325s to provisionDockerMachine
	I0731 10:05:36.520479    3827 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 10:05:36.520499    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:05:36.520508    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.520691    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:05:36.520702    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.520789    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.520884    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.520979    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.521066    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.561300    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:05:36.564926    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:05:36.564938    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:05:36.565027    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:05:36.565170    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:05:36.565176    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:05:36.565342    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:05:36.574123    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:36.603284    3827 start.go:296] duration metric: took 82.788869ms for postStartSetup
	I0731 10:05:36.603307    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.603494    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:05:36.603509    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.603613    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.603706    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.603803    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.603903    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.639240    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:05:36.639297    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:05:36.692559    3827 fix.go:56] duration metric: took 13.466165097s for fixHost
	I0731 10:05:36.692585    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.692728    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.692817    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.692901    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.692991    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.693111    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:36.693255    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:36.693263    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:05:36.752606    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445536.868457526
	
	I0731 10:05:36.752619    3827 fix.go:216] guest clock: 1722445536.868457526
	I0731 10:05:36.752626    3827 fix.go:229] Guest: 2024-07-31 10:05:36.868457526 -0700 PDT Remote: 2024-07-31 10:05:36.692574 -0700 PDT m=+34.358830009 (delta=175.883526ms)
	I0731 10:05:36.752636    3827 fix.go:200] guest clock delta is within tolerance: 175.883526ms
	I0731 10:05:36.752640    3827 start.go:83] releasing machines lock for "ha-393000-m02", held for 13.526270601s
	I0731 10:05:36.752657    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.752793    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:36.777379    3827 out.go:177] * Found network options:
	I0731 10:05:36.798039    3827 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 10:05:36.819503    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:05:36.819540    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820385    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820643    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820770    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:05:36.820818    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 10:05:36.820878    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:05:36.820996    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:05:36.821009    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.821024    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.821247    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.821250    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.821474    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.821525    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.821664    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.821739    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.821918    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 10:05:36.854335    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:05:36.854406    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:05:36.901302    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:05:36.901324    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:36.901422    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:36.917770    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:05:36.926621    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:05:36.935218    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:05:36.935259    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:05:36.943879    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:36.952873    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:05:36.961710    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:36.970281    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:05:36.979176    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:05:36.987922    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:05:36.996548    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:05:37.005349    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:05:37.013281    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:05:37.020977    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:37.118458    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:05:37.137862    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:37.137937    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:05:37.153588    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:37.167668    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:05:37.181903    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:37.192106    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:37.202268    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:05:37.223314    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:37.233629    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:37.248658    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:05:37.251547    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:05:37.258758    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:05:37.272146    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:05:37.371218    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:05:37.472623    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:05:37.472648    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:05:37.486639    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:37.587113    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:05:39.947283    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.360151257s)
	I0731 10:05:39.947347    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:05:39.958391    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:05:39.972060    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:39.983040    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:05:40.085475    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:05:40.202062    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:40.302654    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:05:40.316209    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:40.326252    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:40.418074    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:05:40.482758    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:05:40.482836    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:05:40.487561    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:05:40.487613    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:05:40.491035    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:05:40.518347    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:05:40.518420    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:40.537051    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:40.576384    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:05:40.597853    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:05:40.618716    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:40.618993    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:05:40.622501    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:40.631917    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:05:40.632085    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:40.632302    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:40.632324    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:40.640887    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52034
	I0731 10:05:40.641227    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:40.641546    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:40.641557    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:40.641784    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:40.641900    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:40.641993    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:40.642069    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:05:40.643035    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:05:40.643318    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:40.643340    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:40.651868    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52036
	I0731 10:05:40.652209    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:40.652562    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:40.652581    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:40.652781    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:40.652890    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:40.652982    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.6
	I0731 10:05:40.652988    3827 certs.go:194] generating shared ca certs ...
	I0731 10:05:40.653003    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:40.653135    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:05:40.653190    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:05:40.653199    3827 certs.go:256] generating profile certs ...
	I0731 10:05:40.653301    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:05:40.653388    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.59c17652
	I0731 10:05:40.653436    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:05:40.653443    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:05:40.653468    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:05:40.653489    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:05:40.653510    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:05:40.653529    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:05:40.653548    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:05:40.653566    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:05:40.653584    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:05:40.653667    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:05:40.653713    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:05:40.653722    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:05:40.653755    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:05:40.653790    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:05:40.653819    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:05:40.653897    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:40.653931    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:05:40.653957    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:40.653976    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:05:40.654001    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:40.654103    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:40.654205    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:40.654295    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:40.654382    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:40.686134    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0731 10:05:40.689771    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 10:05:40.697866    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0731 10:05:40.700957    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 10:05:40.708798    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 10:05:40.711973    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 10:05:40.719794    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0731 10:05:40.722937    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 10:05:40.731558    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0731 10:05:40.734708    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 10:05:40.742535    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0731 10:05:40.745692    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 10:05:40.753969    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:05:40.774721    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:05:40.793621    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:05:40.813481    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:05:40.833191    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:05:40.853099    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:05:40.872942    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:05:40.892952    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:05:40.912690    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:05:40.932438    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:05:40.952459    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:05:40.971059    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 10:05:40.984708    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 10:05:40.998235    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 10:05:41.011745    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 10:05:41.025144    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 10:05:41.038794    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 10:05:41.052449    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 10:05:41.066415    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:05:41.070679    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:05:41.078894    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.082206    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.082237    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.086362    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:05:41.094634    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:05:41.103040    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.106511    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.106559    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.110939    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:05:41.119202    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:05:41.127421    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.130734    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.130783    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.134845    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:05:41.142958    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:05:41.146291    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:05:41.150662    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:05:41.154843    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:05:41.159061    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:05:41.163240    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:05:41.167541    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:05:41.171729    3827 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.30.3 docker true true} ...
	I0731 10:05:41.171784    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:05:41.171806    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:05:41.171838    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:05:41.184093    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:05:41.184125    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:05:41.184181    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:05:41.191780    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:05:41.191825    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 10:05:41.199155    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:05:41.212419    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:05:41.225964    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:05:41.239859    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:05:41.242661    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:41.251855    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:41.345266    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:41.360525    3827 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:05:41.360751    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:41.382214    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:05:41.402932    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:41.525126    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:41.539502    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:41.539699    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:05:41.539742    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:05:41.539934    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m02" to be "Ready" ...
	I0731 10:05:41.540009    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:41.540015    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:41.540022    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:41.540026    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.017427    3827 round_trippers.go:574] Response Status: 200 OK in 8477 milliseconds
	I0731 10:05:50.018648    3827 node_ready.go:49] node "ha-393000-m02" has status "Ready":"True"
	I0731 10:05:50.018662    3827 node_ready.go:38] duration metric: took 8.478709659s for node "ha-393000-m02" to be "Ready" ...
	I0731 10:05:50.018668    3827 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:05:50.018717    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:05:50.018723    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.018731    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.018737    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.028704    3827 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0731 10:05:50.043501    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.043562    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:05:50.043568    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.043574    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.043579    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.049258    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.050015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.050025    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.050031    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.050035    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.066794    3827 round_trippers.go:574] Response Status: 200 OK in 16 milliseconds
	I0731 10:05:50.067093    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.067103    3827 pod_ready.go:81] duration metric: took 23.584491ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.067110    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.067150    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 10:05:50.067155    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.067161    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.067170    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.072229    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.072653    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.072662    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.072674    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.072678    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.076158    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:50.076475    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.076487    3827 pod_ready.go:81] duration metric: took 9.372147ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.076494    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.076536    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 10:05:50.076541    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.076547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.076551    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.079467    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.079849    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.079858    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.079866    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.079871    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.086323    3827 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 10:05:50.086764    3827 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.086775    3827 pod_ready.go:81] duration metric: took 10.276448ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.086782    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.086839    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 10:05:50.086846    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.086852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.086861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.090747    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:50.091293    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:50.091301    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.091306    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.091310    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.093538    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.094155    3827 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.094165    3827 pod_ready.go:81] duration metric: took 7.376399ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.094171    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.094209    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 10:05:50.094214    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.094220    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.094223    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.096892    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.219826    3827 request.go:629] Waited for 122.388601ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:05:50.219867    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:05:50.219876    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.219882    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.219887    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.222303    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.222701    3827 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.222710    3827 pod_ready.go:81] duration metric: took 128.533092ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.222720    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.419341    3827 request.go:629] Waited for 196.517978ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:05:50.419372    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:05:50.419376    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.419382    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.419386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.424561    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.619242    3827 request.go:629] Waited for 194.143472ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.619333    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.619339    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.619346    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.619350    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.622245    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.622550    3827 pod_ready.go:97] node "ha-393000" hosting pod "kube-apiserver-ha-393000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-393000" has status "Ready":"False"
	I0731 10:05:50.622563    3827 pod_ready.go:81] duration metric: took 399.836525ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	E0731 10:05:50.622570    3827 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-393000" hosting pod "kube-apiserver-ha-393000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-393000" has status "Ready":"False"
	I0731 10:05:50.622575    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.819353    3827 request.go:629] Waited for 196.739442ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:50.819427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:50.819433    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.819438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.819447    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.822809    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:51.019387    3827 request.go:629] Waited for 196.0195ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.019427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.019480    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.019488    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.019494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.021643    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.220184    3827 request.go:629] Waited for 96.247837ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.220254    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.220260    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.220266    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.220271    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.222468    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.419702    3827 request.go:629] Waited for 196.732028ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.419735    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.419739    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.419746    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.419749    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.422018    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.622851    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.622865    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.622870    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.622873    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.625570    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.818923    3827 request.go:629] Waited for 192.647007ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.818965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.818971    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.818977    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.818981    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.821253    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.123108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:52.123124    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.123133    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.123137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.125336    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.220188    3827 request.go:629] Waited for 94.282602ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.220295    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.220306    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.220317    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.220325    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.223136    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.623123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:52.623202    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.623217    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.623227    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.626259    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:52.626893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.626903    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.626912    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.626916    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.628416    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:52.628799    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:53.124413    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:53.124432    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.124441    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.124446    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.127045    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:53.127494    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:53.127501    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.127511    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.127514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.129223    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:53.623065    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:53.623121    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.623133    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.623142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.626047    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:53.626707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:53.626717    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.626725    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.626729    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.628447    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:54.123646    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:54.123761    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.123778    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.123788    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.127286    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:54.128015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:54.128025    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.128033    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.128038    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.130101    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:54.623229    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:54.623244    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.623253    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.623266    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.625325    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:54.625780    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:54.625788    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.625794    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.625798    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.627218    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:55.123298    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:55.123318    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.123329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.123334    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.126495    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:55.127199    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:55.127207    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.127213    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.127217    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.128585    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:55.128968    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:55.623994    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:55.624008    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.624016    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.624021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.626813    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:55.627329    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:55.627336    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.627342    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.627345    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.628805    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:56.123118    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:56.123195    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.123210    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.123231    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.126276    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:56.126864    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:56.126872    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.126877    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.126881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.128479    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:56.623814    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:56.623924    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.623942    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.623953    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.626841    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:56.627450    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:56.627457    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.627463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.627467    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.628844    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:57.124173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:57.124250    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.124262    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.124287    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.127734    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:57.128370    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:57.128377    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.128383    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.128386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.130108    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:57.130481    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:57.624004    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:57.624033    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.624093    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.624103    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.627095    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:57.628522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:57.628533    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.628541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.628547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.630446    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.123493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:58.123505    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.123512    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.123514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.125506    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.126108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:58.126116    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.126121    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.126124    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.127991    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.623114    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:58.623141    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.623216    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.623228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.626428    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:58.627173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:58.627181    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.627187    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.627191    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.628749    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.123212    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:59.123231    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.123243    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.123249    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.126584    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:59.127100    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:59.127110    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.127118    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.127123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.129080    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.624707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:59.624736    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.624808    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.624814    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.627710    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:59.628543    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:59.628550    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.628556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.628560    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.630077    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.630437    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:00.123863    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:00.123878    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.123885    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.123888    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.125761    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.126237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:00.126245    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.126251    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.126254    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.127937    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.623226    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:00.623240    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.623246    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.623249    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.625210    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.625691    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:00.625699    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.625704    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.625708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.627280    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:01.124705    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:01.124804    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.124820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.124830    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.127445    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:01.127933    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:01.127941    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.127947    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.127950    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.129462    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:01.623718    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:01.623731    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.623736    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.623739    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.625948    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:01.626336    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:01.626344    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.626349    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.626352    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.627901    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.124021    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:02.124081    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.124088    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.124092    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.125801    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.126187    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:02.126195    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.126200    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.126204    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.127656    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.127974    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:02.623206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:02.623222    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.623228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.623232    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.626774    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:02.627381    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:02.627389    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.627395    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.627400    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.630037    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:03.122889    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:03.122980    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.122991    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.122997    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.125539    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:03.125964    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:03.125972    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.125976    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.125991    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.129847    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:03.623340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:03.623368    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.623379    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.623386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.626892    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:03.627517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:03.627524    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.627530    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.627532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.629281    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.123967    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:04.124007    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.124016    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.124021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.126604    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.127104    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.127111    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.127116    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.127131    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.128806    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.129260    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.129268    3827 pod_ready.go:81] duration metric: took 13.506690115s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.129277    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.129312    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:04.129317    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.129323    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.129328    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.131506    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.131966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.131974    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.131980    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.131984    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.133464    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.133963    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.133974    3827 pod_ready.go:81] duration metric: took 4.690553ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.133981    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.134013    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:04.134018    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.134023    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.134028    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.136093    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.136498    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:04.136506    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.136512    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.136515    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.138480    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.138864    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.138874    3827 pod_ready.go:81] duration metric: took 4.887644ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.138882    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.138917    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:04.138922    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.138928    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.138932    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.140760    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.141121    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.141129    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.141134    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.141137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.143127    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.143455    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.143464    3827 pod_ready.go:81] duration metric: took 4.577275ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.143471    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.143508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:04.143513    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.143519    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.143523    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.145638    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.145987    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.145994    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.146000    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.146003    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.147718    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.148046    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.148055    3827 pod_ready.go:81] duration metric: took 4.578507ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.148061    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.325414    3827 request.go:629] Waited for 177.298505ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:04.325532    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:04.325544    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.325555    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.325563    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.328825    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:04.525753    3827 request.go:629] Waited for 196.338568ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.525806    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.525817    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.525828    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.525836    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.529114    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:04.529604    3827 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.529616    3827 pod_ready.go:81] duration metric: took 381.550005ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.529625    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.724886    3827 request.go:629] Waited for 195.165832ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:04.724925    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:04.724931    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.724937    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.724942    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.726934    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.924942    3827 request.go:629] Waited for 197.623557ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.924972    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.924977    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.924984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.924987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.927056    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.927556    3827 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.927566    3827 pod_ready.go:81] duration metric: took 397.934888ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.927572    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.124719    3827 request.go:629] Waited for 197.081968ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:05.124759    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:05.124767    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.124774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.124777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.126705    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:05.324036    3827 request.go:629] Waited for 196.854241ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.324127    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.324136    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.324144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.324151    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.326450    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:05.326831    3827 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:05.326840    3827 pod_ready.go:81] duration metric: took 399.263993ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.326854    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.525444    3827 request.go:629] Waited for 198.543186ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:05.525479    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:05.525484    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.525490    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.525494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.527459    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:05.724382    3827 request.go:629] Waited for 196.465154ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.724493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.724505    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.724516    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.724528    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.727650    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:05.728134    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:05.728147    3827 pod_ready.go:81] duration metric: took 401.285988ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.728155    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.925067    3827 request.go:629] Waited for 196.808438ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:05.925117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:05.925127    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.925137    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.925147    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.928198    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.125772    3827 request.go:629] Waited for 196.79397ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:06.125895    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:06.125907    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.125918    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.125924    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.129114    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.129535    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:06.129548    3827 pod_ready.go:81] duration metric: took 401.386083ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.129557    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.324601    3827 request.go:629] Waited for 194.995432ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:06.324707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:06.324718    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.324729    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.324736    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.327699    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:06.524056    3827 request.go:629] Waited for 195.918056ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:06.524164    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:06.524175    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.524186    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.524192    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.527800    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.528245    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:06.528255    3827 pod_ready.go:81] duration metric: took 398.692914ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.528262    3827 pod_ready.go:38] duration metric: took 16.509588377s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:06.528282    3827 api_server.go:52] waiting for apiserver process to appear ...
	I0731 10:06:06.528341    3827 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:06:06.541572    3827 api_server.go:72] duration metric: took 25.181024878s to wait for apiserver process to appear ...
	I0731 10:06:06.541584    3827 api_server.go:88] waiting for apiserver healthz status ...
	I0731 10:06:06.541605    3827 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 10:06:06.544968    3827 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 10:06:06.545011    3827 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 10:06:06.545016    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.545023    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.545027    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.545730    3827 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 10:06:06.545799    3827 api_server.go:141] control plane version: v1.30.3
	I0731 10:06:06.545808    3827 api_server.go:131] duration metric: took 4.219553ms to wait for apiserver health ...
	I0731 10:06:06.545813    3827 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 10:06:06.724899    3827 request.go:629] Waited for 179.053526ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:06.724936    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:06.724942    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.724948    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.724951    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.733411    3827 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0731 10:06:06.742910    3827 system_pods.go:59] 24 kube-system pods found
	I0731 10:06:06.742937    3827 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:06.742945    3827 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:06.742950    3827 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:06.742953    3827 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:06.742958    3827 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:06.742961    3827 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:06.742963    3827 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:06.742966    3827 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:06.742968    3827 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:06.742971    3827 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:06.742973    3827 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:06.742977    3827 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:06.742981    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:06.742984    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:06.742986    3827 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:06.742989    3827 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:06.742991    3827 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:06.742995    3827 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:06.742998    3827 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:06.743001    3827 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:06.743003    3827 system_pods.go:61] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Pending
	I0731 10:06:06.743006    3827 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:06.743010    3827 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:06.743012    3827 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:06.743017    3827 system_pods.go:74] duration metric: took 197.200154ms to wait for pod list to return data ...
	I0731 10:06:06.743023    3827 default_sa.go:34] waiting for default service account to be created ...
	I0731 10:06:06.925020    3827 request.go:629] Waited for 181.949734ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:06.925060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:06.925067    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.925076    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.925081    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.927535    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:06.927730    3827 default_sa.go:45] found service account: "default"
	I0731 10:06:06.927740    3827 default_sa.go:55] duration metric: took 184.712762ms for default service account to be created ...
	I0731 10:06:06.927745    3827 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 10:06:07.125051    3827 request.go:629] Waited for 197.272072ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:07.125082    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:07.125090    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:07.125100    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:07.125104    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:07.129975    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:07.134630    3827 system_pods.go:86] 24 kube-system pods found
	I0731 10:06:07.134648    3827 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:07.134654    3827 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:07.134659    3827 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:07.134663    3827 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:07.134666    3827 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:07.134671    3827 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running / Ready:ContainersNotReady (containers with unready status: [kindnet-cni]) / ContainersReady:ContainersNotReady (containers with unready status: [kindnet-cni])
	I0731 10:06:07.134675    3827 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:07.134679    3827 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:07.134683    3827 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:07.134705    3827 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:07.134712    3827 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:07.134718    3827 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0731 10:06:07.134723    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:07.134728    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:07.134731    3827 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:07.134735    3827 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:07.134739    3827 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running / Ready:ContainersNotReady (containers with unready status: [kube-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-proxy])
	I0731 10:06:07.134743    3827 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:07.134747    3827 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:07.134751    3827 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:07.134755    3827 system_pods.go:89] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:07.134764    3827 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:07.134768    3827 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:07.134772    3827 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0731 10:06:07.134781    3827 system_pods.go:126] duration metric: took 207.030567ms to wait for k8s-apps to be running ...
	I0731 10:06:07.134786    3827 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 10:06:07.134841    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:06:07.148198    3827 system_svc.go:56] duration metric: took 13.406485ms WaitForService to wait for kubelet
	I0731 10:06:07.148215    3827 kubeadm.go:582] duration metric: took 25.78766951s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:06:07.148230    3827 node_conditions.go:102] verifying NodePressure condition ...
	I0731 10:06:07.324197    3827 request.go:629] Waited for 175.905806ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:07.324227    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:07.324232    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:07.324238    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:07.324243    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:07.329946    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:07.330815    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330830    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330840    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330843    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330847    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330850    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330853    3827 node_conditions.go:105] duration metric: took 182.619551ms to run NodePressure ...
	I0731 10:06:07.330860    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:06:07.330878    3827 start.go:255] writing updated cluster config ...
	I0731 10:06:07.352309    3827 out.go:177] 
	I0731 10:06:07.373528    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:07.373631    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.433500    3827 out.go:177] * Starting "ha-393000-m03" control-plane node in "ha-393000" cluster
	I0731 10:06:07.475236    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:06:07.475262    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:06:07.475398    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:06:07.475412    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:06:07.475498    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.476024    3827 start.go:360] acquireMachinesLock for ha-393000-m03: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:06:07.476077    3827 start.go:364] duration metric: took 40.57µs to acquireMachinesLock for "ha-393000-m03"
	I0731 10:06:07.476090    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:06:07.476095    3827 fix.go:54] fixHost starting: m03
	I0731 10:06:07.476337    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:07.476357    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:07.485700    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52041
	I0731 10:06:07.486069    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:07.486427    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:07.486449    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:07.486677    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:07.486797    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:07.486888    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 10:06:07.486969    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.487057    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 10:06:07.488010    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:06:07.488031    3827 fix.go:112] recreateIfNeeded on ha-393000-m03: state=Stopped err=<nil>
	I0731 10:06:07.488039    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	W0731 10:06:07.488129    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:06:07.525270    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m03" ...
	I0731 10:06:07.583189    3827 main.go:141] libmachine: (ha-393000-m03) Calling .Start
	I0731 10:06:07.583357    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.583398    3827 main.go:141] libmachine: (ha-393000-m03) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid
	I0731 10:06:07.584444    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:06:07.584457    3827 main.go:141] libmachine: (ha-393000-m03) DBG | pid 2994 is in state "Stopped"
	I0731 10:06:07.584473    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid...
	I0731 10:06:07.584622    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Using UUID 451d6bef-97a7-42a6-8ccb-b8851dda0594
	I0731 10:06:07.614491    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Generated MAC 3e:56:a2:18:e2:4c
	I0731 10:06:07.614519    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:06:07.614662    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003b11a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:07.614709    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003b11a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:07.614792    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "451d6bef-97a7-42a6-8ccb-b8851dda0594", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:06:07.614841    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 451d6bef-97a7-42a6-8ccb-b8851dda0594 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:06:07.614865    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:06:07.616508    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Pid is 3858
	I0731 10:06:07.617000    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 0
	I0731 10:06:07.617017    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.617185    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 3858
	I0731 10:06:07.619558    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 10:06:07.619621    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:06:07.619647    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:06:07.619664    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:06:07.619685    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:06:07.619703    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 10:06:07.619712    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Found match: 3e:56:a2:18:e2:4c
	I0731 10:06:07.619727    3827 main.go:141] libmachine: (ha-393000-m03) DBG | IP: 192.169.0.7
	I0731 10:06:07.619755    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 10:06:07.620809    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:07.621055    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.621590    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:06:07.621602    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:07.621745    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:07.621861    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:07.621957    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:07.622061    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:07.622150    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:07.622290    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:07.622460    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:07.622469    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:06:07.625744    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:06:07.635188    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:06:07.636453    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:07.636476    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:07.636488    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:07.636503    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:08.026194    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:06:08.026210    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:06:08.141380    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:08.141403    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:08.141420    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:08.141430    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:08.142228    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:06:08.142237    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:06:13.717443    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:06:13.717596    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:06:13.717612    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:06:13.741129    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:06:18.682578    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:06:18.682599    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.682767    3827 buildroot.go:166] provisioning hostname "ha-393000-m03"
	I0731 10:06:18.682779    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.682866    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.682981    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.683070    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.683166    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.683267    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.683412    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.683571    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.683581    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m03 && echo "ha-393000-m03" | sudo tee /etc/hostname
	I0731 10:06:18.749045    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m03
	
	I0731 10:06:18.749064    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.749190    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.749278    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.749369    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.749454    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.749565    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.749706    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.749722    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:06:18.806865    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:06:18.806883    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:06:18.806892    3827 buildroot.go:174] setting up certificates
	I0731 10:06:18.806898    3827 provision.go:84] configureAuth start
	I0731 10:06:18.806904    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.807035    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:18.807129    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.807209    3827 provision.go:143] copyHostCerts
	I0731 10:06:18.807236    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:06:18.807287    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:06:18.807293    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:06:18.807440    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:06:18.807654    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:06:18.807687    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:06:18.807691    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:06:18.807798    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:06:18.807946    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:06:18.807978    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:06:18.807983    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:06:18.808051    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:06:18.808199    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m03 san=[127.0.0.1 192.169.0.7 ha-393000-m03 localhost minikube]
	I0731 10:06:18.849388    3827 provision.go:177] copyRemoteCerts
	I0731 10:06:18.849440    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:06:18.849454    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.849608    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.849706    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.849793    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.849878    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:18.882927    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:06:18.883001    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:06:18.902836    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:06:18.902904    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:06:18.922711    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:06:18.922778    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 10:06:18.943709    3827 provision.go:87] duration metric: took 136.803232ms to configureAuth
	I0731 10:06:18.943724    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:06:18.943896    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:18.943910    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:18.944075    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.944168    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.944245    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.944342    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.944422    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.944538    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.944665    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.944672    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:06:18.996744    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:06:18.996756    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:06:18.996829    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:06:18.996840    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.996972    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.997082    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.997171    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.997252    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.997394    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.997538    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.997587    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:06:19.061774    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:06:19.061792    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:19.061924    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:19.062001    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:19.062094    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:19.062183    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:19.062322    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:19.062475    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:19.062487    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:06:20.667693    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:06:20.667709    3827 machine.go:97] duration metric: took 13.046112735s to provisionDockerMachine
	I0731 10:06:20.667718    3827 start.go:293] postStartSetup for "ha-393000-m03" (driver="hyperkit")
	I0731 10:06:20.667725    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:06:20.667738    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.667939    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:06:20.667954    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.668063    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.668167    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.668260    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.668365    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:20.711043    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:06:20.714520    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:06:20.714533    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:06:20.714632    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:06:20.714782    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:06:20.714789    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:06:20.714971    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:06:20.725237    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:06:20.756197    3827 start.go:296] duration metric: took 88.463878ms for postStartSetup
	I0731 10:06:20.756221    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.756402    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:06:20.756417    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.756509    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.756594    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.756688    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.756757    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:20.788829    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:06:20.788889    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:06:20.841715    3827 fix.go:56] duration metric: took 13.365618842s for fixHost
	I0731 10:06:20.841743    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.841878    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.841982    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.842069    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.842155    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.842314    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:20.842486    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:20.842494    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:06:20.895743    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445580.896263750
	
	I0731 10:06:20.895763    3827 fix.go:216] guest clock: 1722445580.896263750
	I0731 10:06:20.895768    3827 fix.go:229] Guest: 2024-07-31 10:06:20.89626375 -0700 PDT Remote: 2024-07-31 10:06:20.841731 -0700 PDT m=+78.507993684 (delta=54.53275ms)
	I0731 10:06:20.895779    3827 fix.go:200] guest clock delta is within tolerance: 54.53275ms
	I0731 10:06:20.895783    3827 start.go:83] releasing machines lock for "ha-393000-m03", held for 13.419701289s
	I0731 10:06:20.895800    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.895930    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:20.933794    3827 out.go:177] * Found network options:
	I0731 10:06:21.008361    3827 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0731 10:06:21.029193    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:06:21.029220    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:06:21.029239    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.029902    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.030149    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.030274    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:06:21.030303    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	W0731 10:06:21.030372    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:06:21.030402    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:06:21.030458    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:21.030487    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:06:21.030508    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:21.030615    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:21.030657    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:21.030724    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:21.030782    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:21.030837    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:21.030887    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:21.030941    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	W0731 10:06:21.060481    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:06:21.060548    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:06:21.113024    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:06:21.113039    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:06:21.113103    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:06:21.128523    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:06:21.136837    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:06:21.145325    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:06:21.145388    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:06:21.153686    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:06:21.162021    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:06:21.170104    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:06:21.178345    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:06:21.186720    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:06:21.195003    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:06:21.203212    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:06:21.211700    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:06:21.219303    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:06:21.226730    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:21.333036    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:06:21.355400    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:06:21.355468    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:06:21.370793    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:06:21.382599    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:06:21.397116    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:06:21.408366    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:06:21.419500    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:06:21.441593    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:06:21.453210    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:06:21.468638    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:06:21.471686    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:06:21.480107    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:06:21.493473    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:06:21.590098    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:06:21.695002    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:06:21.695025    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:06:21.709644    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:21.804799    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:06:24.090859    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.286034061s)
	I0731 10:06:24.090921    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:06:24.102085    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:06:24.115631    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:06:24.125950    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:06:24.222193    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:06:24.332843    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:24.449689    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:06:24.463232    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:06:24.474652    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:24.567486    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:06:24.631150    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:06:24.631230    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:06:24.635708    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:06:24.635764    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:06:24.638929    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:06:24.666470    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:06:24.666542    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:06:24.686587    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:06:24.729344    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:06:24.771251    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:06:24.792172    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 10:06:24.813314    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:24.813703    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:06:24.818215    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:06:24.828147    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:06:24.828324    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:24.828531    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:24.828552    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:24.837259    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52063
	I0731 10:06:24.837609    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:24.837954    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:24.837967    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:24.838165    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:24.838272    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:06:24.838349    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:24.838424    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:06:24.839404    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:06:24.839647    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:24.839672    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:24.848293    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52065
	I0731 10:06:24.848630    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:24.848982    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:24.848999    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:24.849191    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:24.849297    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:06:24.849393    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.7
	I0731 10:06:24.849399    3827 certs.go:194] generating shared ca certs ...
	I0731 10:06:24.849408    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:06:24.849551    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:06:24.849606    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:06:24.849615    3827 certs.go:256] generating profile certs ...
	I0731 10:06:24.849710    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:06:24.849799    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb
	I0731 10:06:24.849848    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:06:24.849860    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:06:24.849881    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:06:24.849901    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:06:24.849920    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:06:24.849937    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:06:24.849955    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:06:24.849974    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:06:24.849991    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:06:24.850072    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:06:24.850109    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:06:24.850118    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:06:24.850152    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:06:24.850184    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:06:24.850218    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:06:24.850285    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:06:24.850322    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:06:24.850344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:06:24.850366    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:24.850395    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:06:24.850485    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:06:24.850565    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:06:24.850653    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:06:24.850732    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:06:24.882529    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0731 10:06:24.886785    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 10:06:24.896598    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0731 10:06:24.900384    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 10:06:24.910269    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 10:06:24.914011    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 10:06:24.922532    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0731 10:06:24.925784    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 10:06:24.936850    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0731 10:06:24.940321    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 10:06:24.950026    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0731 10:06:24.953055    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 10:06:24.962295    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:06:24.982990    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:06:25.003016    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:06:25.022822    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:06:25.043864    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:06:25.064140    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:06:25.084546    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:06:25.105394    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:06:25.125890    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:06:25.146532    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:06:25.166742    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:06:25.186545    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 10:06:25.200206    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 10:06:25.214106    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 10:06:25.228037    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 10:06:25.242065    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 10:06:25.255847    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 10:06:25.269574    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 10:06:25.283881    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:06:25.288466    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:06:25.297630    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.301289    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.301331    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.305714    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:06:25.314348    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:06:25.322967    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.326578    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.326634    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.330926    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:06:25.339498    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:06:25.348151    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.351535    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.351576    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.355921    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:06:25.364535    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:06:25.368077    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:06:25.372428    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:06:25.376757    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:06:25.380980    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:06:25.385296    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:06:25.389606    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:06:25.393857    3827 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.30.3 docker true true} ...
	I0731 10:06:25.393914    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:06:25.393928    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:06:25.393959    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:06:25.405786    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:06:25.405830    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:06:25.405888    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:06:25.414334    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:06:25.414379    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 10:06:25.422310    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:06:25.435970    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:06:25.449652    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:06:25.463392    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:06:25.466266    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:06:25.476391    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:25.572265    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:06:25.587266    3827 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:06:25.587454    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:25.609105    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:06:25.650600    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:25.776520    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:06:25.790838    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:06:25.791048    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:06:25.791095    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:06:25.791257    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m03" to be "Ready" ...
	I0731 10:06:25.791299    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:25.791305    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.791311    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.791315    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.793351    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:25.793683    3827 node_ready.go:49] node "ha-393000-m03" has status "Ready":"True"
	I0731 10:06:25.793693    3827 node_ready.go:38] duration metric: took 2.426331ms for node "ha-393000-m03" to be "Ready" ...
	I0731 10:06:25.793700    3827 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:25.793737    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:25.793742    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.793753    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.793758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.797877    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:25.803934    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:25.803995    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:25.804000    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.804007    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.804011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.806477    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:25.806997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:25.807005    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.807011    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.807014    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.808989    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:26.304983    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:26.304998    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.305006    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.305010    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.307209    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:26.307839    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:26.307846    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.307852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.307861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.309644    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:26.805493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:26.805510    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.805520    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.805527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.821394    3827 round_trippers.go:574] Response Status: 200 OK in 15 milliseconds
	I0731 10:06:26.822205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:26.822215    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.822221    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.822224    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.827160    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:27.305824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:27.305839    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.305846    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.305848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.308258    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.308744    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:27.308752    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.308758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.308761    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.310974    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.805552    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:27.805567    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.805574    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.805578    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.807860    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.808403    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:27.808410    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.808416    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.808419    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.810436    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.810811    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:28.305577    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:28.305593    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.305600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.305604    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.311583    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:28.312446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:28.312455    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.312461    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.312465    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.314779    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:28.804391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:28.804407    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.804414    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.804420    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.806848    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:28.807227    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:28.807235    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.807241    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.807244    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.809171    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:29.305552    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:29.305615    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.305624    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.305629    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.308134    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.308891    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:29.308900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.308906    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.308909    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.311098    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.805109    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:29.805127    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.805192    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.805198    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.807898    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.808285    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:29.808292    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.808297    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.808300    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.810154    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:30.305017    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:30.305032    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.305045    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.305048    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.307205    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:30.307776    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:30.307783    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.307789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.307792    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.309771    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:30.310293    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:30.805366    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:30.805428    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.805436    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.805440    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.807864    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:30.808309    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:30.808316    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.808322    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.808325    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.810111    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:31.305667    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:31.305700    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.305708    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.305712    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.308126    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:31.308539    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:31.308546    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.308552    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.308556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.310279    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:31.804975    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:31.805002    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.805014    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.805020    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.808534    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:31.809053    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:31.809061    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.809066    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.809069    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.810955    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:32.304759    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:32.304815    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.304830    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.304839    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.308267    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:32.308684    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:32.308692    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.308698    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.308701    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.310475    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:32.310804    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:32.805138    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:32.805163    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.805175    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.805181    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.808419    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:32.809125    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:32.809133    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.809139    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.809143    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.810741    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:33.305088    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:33.305103    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.305109    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.305113    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.307495    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:33.307998    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:33.308005    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.308011    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.308015    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.309595    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:33.806000    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:33.806021    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.806049    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.806056    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.808625    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:33.809248    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:33.809259    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.809264    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.809269    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.810758    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:34.305752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:34.305832    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.305847    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.305853    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.308868    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:34.309591    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:34.309599    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.309605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.309608    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.311263    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:34.311627    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:34.804923    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:34.804948    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.804959    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.804965    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.808036    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:34.808636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:34.808646    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.808654    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.808670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.810398    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:35.305879    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:35.305966    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.305982    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.305991    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.309016    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:35.309584    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:35.309592    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.309598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.309601    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.311155    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:35.804092    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:35.804107    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.804114    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.804117    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.806476    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:35.806988    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:35.806997    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.807002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.807025    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.808897    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.305921    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:36.305943    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.305951    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.305955    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.308670    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:36.309170    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:36.309178    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.309184    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.309199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.310943    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.805015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:36.805085    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.805098    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.805106    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.808215    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:36.808810    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:36.808817    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.808823    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.808827    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.810482    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.810768    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:37.305031    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:37.305055    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.305068    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.305077    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.308209    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:37.308934    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:37.308942    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.308947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.308951    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.310514    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:37.805625    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:37.805671    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.805682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.805687    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.808188    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:37.808728    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:37.808735    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.808741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.808744    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.810288    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:38.305824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:38.305838    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.305845    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.305848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.307926    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:38.308378    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:38.308386    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.308391    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.308395    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.310092    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:38.805380    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:38.805397    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.805406    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.805410    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.807819    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:38.808368    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:38.808376    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.808382    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.808385    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.809904    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:39.305804    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:39.305820    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.305826    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.305830    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.307991    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:39.308527    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:39.308535    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.308541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.308546    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.310495    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:39.310929    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:39.806108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:39.806122    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.806129    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.806132    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.808192    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:39.808709    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:39.808718    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.808727    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.808730    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.810476    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:40.304101    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:40.304125    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.304137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.304144    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.307004    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:40.307629    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:40.307637    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.307643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.307646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.309373    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:40.804289    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:40.804302    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.804329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.804334    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.806678    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:40.807320    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:40.807328    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.807334    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.807338    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.809111    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:41.305710    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:41.305762    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.305770    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.305774    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.307795    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.308244    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:41.308252    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.308258    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.308261    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.310033    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:41.805219    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:41.805235    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.805242    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.805246    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.807574    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.808103    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:41.808112    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.808119    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.808123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.810305    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.810720    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:42.305509    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:42.305569    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.305580    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.305586    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.307774    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:42.308154    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:42.308161    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.308167    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.308170    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.309895    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:42.804631    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:42.804655    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.804667    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.804687    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.808080    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:42.808852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:42.808863    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.808869    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.808874    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.811059    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.304116    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:43.304217    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.304233    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.304239    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.306879    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.307340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:43.307348    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.307354    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.307358    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.308948    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:43.805920    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:43.805934    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.805981    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.805986    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.808009    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.808576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:43.808583    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.808589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.808592    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.810282    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:43.810804    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:44.304703    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:44.304728    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.304798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.304823    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.308376    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:44.308780    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:44.308787    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.308793    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.308797    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.310396    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:44.805218    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:44.805242    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.805255    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.805264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.808404    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:44.808967    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:44.808978    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.808986    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.808990    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.810748    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:45.304672    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:45.304770    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.304784    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.304791    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.307754    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:45.308249    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:45.308256    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.308261    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.308265    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.309903    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:45.804236    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:45.804265    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.804276    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.804281    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.807605    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:45.808214    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:45.808222    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.808228    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.808231    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.810076    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:46.305660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:46.305674    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.305723    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.305727    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.307959    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:46.308389    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:46.308397    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.308403    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.308406    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.310188    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:46.310668    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:46.805585    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:46.805685    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.805700    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.805708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.808399    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:46.808892    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:46.808900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.808910    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.808914    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.810397    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.304911    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:47.304926    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.304933    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.304936    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.307282    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.307761    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.307768    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.307774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.307777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.309541    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.309921    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.309931    3827 pod_ready.go:81] duration metric: took 21.505983976s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.309937    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.309966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 10:06:47.309971    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.309977    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.309980    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.311547    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.311995    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.312003    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.312009    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.312013    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.313414    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.313802    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.313811    3827 pod_ready.go:81] duration metric: took 3.869093ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.313818    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.313850    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 10:06:47.313855    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.313861    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.313865    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.315523    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.315938    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.315947    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.315955    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.315959    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.317522    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.317922    3827 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.317931    3827 pod_ready.go:81] duration metric: took 4.10711ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.317937    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.317971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 10:06:47.317976    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.317982    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.317985    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.319520    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.319893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:47.319900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.319906    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.319909    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.321439    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.321816    3827 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.321825    3827 pod_ready.go:81] duration metric: took 3.88293ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.321832    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.321862    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 10:06:47.321867    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.321872    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.321876    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.323407    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.323756    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:47.323763    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.323769    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.323773    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.325384    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.325703    3827 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.325712    3827 pod_ready.go:81] duration metric: took 3.875112ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.325727    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.505410    3827 request.go:629] Waited for 179.649549ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:06:47.505447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:06:47.505454    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.505462    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.505467    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.508003    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.705861    3827 request.go:629] Waited for 197.38651ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.705965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.705976    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.705987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.705997    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.708863    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.709477    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.709486    3827 pod_ready.go:81] duration metric: took 383.754198ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.709493    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.905743    3827 request.go:629] Waited for 196.205437ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:47.905783    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:47.905790    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.905812    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.905826    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.908144    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.106945    3827 request.go:629] Waited for 198.217758ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:48.106991    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:48.106998    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.107017    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.107023    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.109503    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.109889    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.109898    3827 pod_ready.go:81] duration metric: took 400.399458ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.109910    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.306502    3827 request.go:629] Waited for 196.553294ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:48.306576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:48.306583    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.306589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.306593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.308907    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.506077    3827 request.go:629] Waited for 196.82354ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:48.506171    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:48.506180    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.506189    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.506195    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.508341    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.508805    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.508814    3827 pod_ready.go:81] duration metric: took 398.898513ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.508829    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.706656    3827 request.go:629] Waited for 197.780207ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:48.706753    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:48.706765    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.706776    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.706784    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.709960    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:48.906621    3827 request.go:629] Waited for 195.987746ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:48.906714    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:48.906726    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.906737    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.906744    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.910100    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:48.910537    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.910550    3827 pod_ready.go:81] duration metric: took 401.715473ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.910559    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.106125    3827 request.go:629] Waited for 195.518023ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:49.106250    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:49.106262    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.106273    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.106280    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.109411    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:49.306599    3827 request.go:629] Waited for 196.360989ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:49.306720    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:49.306730    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.306741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.306747    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.309953    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:49.310311    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:49.310320    3827 pod_ready.go:81] duration metric: took 399.753992ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.310327    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.505092    3827 request.go:629] Waited for 194.718659ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:49.505129    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:49.505134    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.505140    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.505144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.510347    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:49.706499    3827 request.go:629] Waited for 195.722594ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:49.706547    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:49.706556    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.706623    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.706634    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.709639    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:49.710039    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:49.710049    3827 pod_ready.go:81] duration metric: took 399.716837ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.710061    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.906378    3827 request.go:629] Waited for 196.280735ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:49.906412    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:49.906418    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.906425    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.906442    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.911634    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:50.106586    3827 request.go:629] Waited for 194.536585ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:50.106637    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:50.106652    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.106717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.106725    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.109661    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:50.110176    3827 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.110189    3827 pod_ready.go:81] duration metric: took 400.121095ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.110197    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.306216    3827 request.go:629] Waited for 195.968962ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:50.306280    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:50.306286    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.306291    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.306301    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.308314    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:50.505180    3827 request.go:629] Waited for 196.336434ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:50.505320    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:50.505332    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.505344    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.505351    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.508601    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:50.509059    3827 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.509072    3827 pod_ready.go:81] duration metric: took 398.868353ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.509081    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.705014    3827 request.go:629] Waited for 195.886159ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:50.705123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:50.705134    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.705144    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.705151    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.708274    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:50.906912    3827 request.go:629] Waited for 198.179332ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:50.906985    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:50.906991    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.906997    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.907002    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.908938    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:50.909509    3827 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.909519    3827 pod_ready.go:81] duration metric: took 400.431581ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.909525    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.106576    3827 request.go:629] Waited for 197.012349ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:51.106660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:51.106668    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.106677    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.106682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.109021    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.305894    3827 request.go:629] Waited for 196.495089ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:51.305945    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:51.306000    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.306010    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.306018    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.308864    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.309301    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:51.309311    3827 pod_ready.go:81] duration metric: took 399.779835ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.309324    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.504969    3827 request.go:629] Waited for 195.610894ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:51.505060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:51.505066    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.505072    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.505076    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.507056    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:51.705447    3827 request.go:629] Waited for 197.942219ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:51.705508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:51.705515    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.705522    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.705527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.707999    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.708367    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:51.708379    3827 pod_ready.go:81] duration metric: took 399.049193ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.708391    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.906469    3827 request.go:629] Waited for 198.035792ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:51.906523    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:51.906531    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.906539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.906545    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.909082    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.105038    3827 request.go:629] Waited for 195.597271ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:52.105087    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:52.105095    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.105157    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.105168    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.108049    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.108591    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:52.108604    3827 pod_ready.go:81] duration metric: took 400.204131ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:52.108615    3827 pod_ready.go:38] duration metric: took 26.314911332s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:52.108628    3827 api_server.go:52] waiting for apiserver process to appear ...
	I0731 10:06:52.108680    3827 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:06:52.120989    3827 api_server.go:72] duration metric: took 26.533695803s to wait for apiserver process to appear ...
	I0731 10:06:52.121002    3827 api_server.go:88] waiting for apiserver healthz status ...
	I0731 10:06:52.121014    3827 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 10:06:52.124310    3827 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 10:06:52.124340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 10:06:52.124344    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.124353    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.124358    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.124912    3827 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 10:06:52.124978    3827 api_server.go:141] control plane version: v1.30.3
	I0731 10:06:52.124989    3827 api_server.go:131] duration metric: took 3.981645ms to wait for apiserver health ...
	I0731 10:06:52.124994    3827 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 10:06:52.305762    3827 request.go:629] Waited for 180.72349ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.305845    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.305853    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.305861    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.305872    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.310548    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:52.315274    3827 system_pods.go:59] 24 kube-system pods found
	I0731 10:06:52.315286    3827 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 10:06:52.315289    3827 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:52.315292    3827 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:52.315295    3827 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:52.315298    3827 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:52.315301    3827 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:52.315303    3827 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:52.315306    3827 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:52.315311    3827 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:52.315313    3827 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:52.315316    3827 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:52.315319    3827 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:52.315322    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:52.315327    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:52.315330    3827 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:52.315333    3827 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:52.315335    3827 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:52.315338    3827 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:52.315341    3827 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:52.315343    3827 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:52.315346    3827 system_pods.go:61] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:52.315348    3827 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:52.315350    3827 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:52.315353    3827 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:52.315358    3827 system_pods.go:74] duration metric: took 190.3593ms to wait for pod list to return data ...
	I0731 10:06:52.315363    3827 default_sa.go:34] waiting for default service account to be created ...
	I0731 10:06:52.505103    3827 request.go:629] Waited for 189.702061ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:52.505178    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:52.505187    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.505195    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.505199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.507558    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.507636    3827 default_sa.go:45] found service account: "default"
	I0731 10:06:52.507644    3827 default_sa.go:55] duration metric: took 192.276446ms for default service account to be created ...
	I0731 10:06:52.507666    3827 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 10:06:52.705427    3827 request.go:629] Waited for 197.710286ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.705484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.705497    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.705519    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.705526    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.711904    3827 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 10:06:52.716760    3827 system_pods.go:86] 24 kube-system pods found
	I0731 10:06:52.716772    3827 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 10:06:52.716777    3827 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:52.716780    3827 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:52.716783    3827 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:52.716787    3827 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:52.716790    3827 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:52.716794    3827 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:52.716798    3827 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:52.716801    3827 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:52.716805    3827 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:52.716809    3827 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:52.716813    3827 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:52.716816    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:52.716819    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:52.716823    3827 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:52.716827    3827 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:52.716830    3827 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:52.716833    3827 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:52.716836    3827 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:52.716854    3827 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:52.716860    3827 system_pods.go:89] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:52.716864    3827 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:52.716867    3827 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:52.716871    3827 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:52.716876    3827 system_pods.go:126] duration metric: took 209.203713ms to wait for k8s-apps to be running ...
	I0731 10:06:52.716881    3827 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 10:06:52.716936    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:06:52.731223    3827 system_svc.go:56] duration metric: took 14.33545ms WaitForService to wait for kubelet
	I0731 10:06:52.731240    3827 kubeadm.go:582] duration metric: took 27.143948309s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:06:52.731255    3827 node_conditions.go:102] verifying NodePressure condition ...
	I0731 10:06:52.906178    3827 request.go:629] Waited for 174.879721ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:52.906213    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:52.906218    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.906257    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.906264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.908378    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.909014    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909025    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909032    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909035    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909039    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909041    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909045    3827 node_conditions.go:105] duration metric: took 177.780993ms to run NodePressure ...
	I0731 10:06:52.909053    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:06:52.909067    3827 start.go:255] writing updated cluster config ...
	I0731 10:06:52.931184    3827 out.go:177] 
	I0731 10:06:52.952773    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:52.952858    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:52.974676    3827 out.go:177] * Starting "ha-393000-m04" worker node in "ha-393000" cluster
	I0731 10:06:53.016553    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:06:53.016583    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:06:53.016766    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:06:53.016784    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:06:53.016901    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:53.017869    3827 start.go:360] acquireMachinesLock for ha-393000-m04: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:06:53.017982    3827 start.go:364] duration metric: took 90.107µs to acquireMachinesLock for "ha-393000-m04"
	I0731 10:06:53.018005    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:06:53.018013    3827 fix.go:54] fixHost starting: m04
	I0731 10:06:53.018399    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:53.018423    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:53.027659    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52069
	I0731 10:06:53.028033    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:53.028349    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:53.028359    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:53.028586    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:53.028695    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:06:53.028810    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 10:06:53.028891    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.028978    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 10:06:53.029947    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid 3095 missing from process table
	I0731 10:06:53.029967    3827 fix.go:112] recreateIfNeeded on ha-393000-m04: state=Stopped err=<nil>
	I0731 10:06:53.029982    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	W0731 10:06:53.030076    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:06:53.051730    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m04" ...
	I0731 10:06:53.093566    3827 main.go:141] libmachine: (ha-393000-m04) Calling .Start
	I0731 10:06:53.093954    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.094004    3827 main.go:141] libmachine: (ha-393000-m04) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid
	I0731 10:06:53.094113    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Using UUID 8a49f5e0-ba79-41ac-9a76-c032dc065628
	I0731 10:06:53.120538    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Generated MAC d2:d8:fb:1d:1:ee
	I0731 10:06:53.120559    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:06:53.120750    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00032a1e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:53.120805    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00032a1e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:53.120864    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "8a49f5e0-ba79-41ac-9a76-c032dc065628", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:06:53.120909    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 8a49f5e0-ba79-41ac-9a76-c032dc065628 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:06:53.120925    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:06:53.122259    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Pid is 3870
	I0731 10:06:53.122766    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 0
	I0731 10:06:53.122781    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.122872    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3870
	I0731 10:06:53.125179    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 10:06:53.125242    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:06:53.125254    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:06:53.125266    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:06:53.125273    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:06:53.125280    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:06:53.125287    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Found match: d2:d8:fb:1d:1:ee
	I0731 10:06:53.125295    3827 main.go:141] libmachine: (ha-393000-m04) DBG | IP: 192.169.0.8
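The lines above show the hyperkit driver mapping the VM's generated MAC address to an IP by scanning `/var/db/dhcpd_leases` on the macOS host. A standalone sketch of that lookup follows; the lease-file syntax and the `awk` approach are assumptions for illustration, run against inlined sample data rather than the real host file:

```shell
# Hypothetical sketch of the MAC-to-IP lease lookup; the sample data below
# stands in for the real /var/db/dhcpd_leases (format assumed).
leases=$(mktemp)
cat > "$leases" <<'EOF'
{
  name=minikube
  ip_address=192.169.0.7
  hw_address=1,3e:56:a2:18:e2:4c
}
{
  name=minikube
  ip_address=192.169.0.8
  hw_address=1,d2:d8:fb:1d:1:ee
}
EOF
mac='d2:d8:fb:1d:1:ee'
# Remember the most recent ip_address seen; print it when hw_address matches.
ip=$(awk -v mac="$mac" '
  /ip_address=/ { sub(/.*ip_address=/, ""); cur = $0 }
  /hw_address=/ && index($0, mac) { print cur }
' "$leases")
echo "$ip"   # 192.169.0.8 for the sample data above
```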
	I0731 10:06:53.125358    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetConfigRaw
	I0731 10:06:53.126014    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:06:53.126188    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:53.126707    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:06:53.126722    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:06:53.126959    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:06:53.127071    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:06:53.127158    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:06:53.127274    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:06:53.127389    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:06:53.127538    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:53.127705    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:06:53.127713    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:06:53.131247    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:06:53.140131    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:06:53.141373    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:53.141406    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:53.141429    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:53.141447    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:53.528683    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:06:53.528699    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:06:53.643451    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:53.643474    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:53.643483    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:53.643491    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:53.644344    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:06:53.644357    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:06:59.241509    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:06:59.241622    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:06:59.241636    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:06:59.265250    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:07:04.190144    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:07:04.190159    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.190326    3827 buildroot.go:166] provisioning hostname "ha-393000-m04"
	I0731 10:07:04.190338    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.190427    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.190528    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.190617    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.190711    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.190826    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.190962    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.191110    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.191119    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m04 && echo "ha-393000-m04" | sudo tee /etc/hostname
	I0731 10:07:04.259087    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m04
	
	I0731 10:07:04.259102    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.259236    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.259339    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.259439    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.259526    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.259647    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.259797    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.259811    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:07:04.323580    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
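The `/etc/hosts` patching script captured above (replace any existing `127.0.1.1` entry, otherwise append one) can be exercised against a throwaway copy; the starting file contents here are an assumption for illustration:

```shell
# Re-run the logged hosts-patching logic on a temp file instead of /etc/hosts.
hosts=$(mktemp)
printf '127.0.0.1 localhost\n127.0.1.1 old-name\n' > "$hosts"
name=ha-393000-m04
if ! grep -q "[[:space:]]$name" "$hosts"; then
  if grep -q '^127\.0\.1\.1[[:space:]]' "$hosts"; then
    # An entry exists: rewrite it in place with the new hostname.
    sed -i "s/^127\.0\.1\.1[[:space:]].*/127.0.1.1 $name/" "$hosts"
  else
    # No entry yet: append one.
    echo "127.0.1.1 $name" >> "$hosts"
  fi
fi
cat "$hosts"
```

Note that `sed -i` as written assumes GNU sed; BSD sed on macOS would need `sed -i ''`.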
	I0731 10:07:04.323604    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:07:04.323616    3827 buildroot.go:174] setting up certificates
	I0731 10:07:04.323623    3827 provision.go:84] configureAuth start
	I0731 10:07:04.323630    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.323758    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:04.323858    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.323932    3827 provision.go:143] copyHostCerts
	I0731 10:07:04.323960    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:07:04.324021    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:07:04.324027    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:07:04.324150    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:07:04.324352    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:07:04.324397    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:07:04.324402    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:07:04.324482    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:07:04.324627    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:07:04.324668    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:07:04.324674    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:07:04.324752    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:07:04.324900    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m04 san=[127.0.0.1 192.169.0.8 ha-393000-m04 localhost minikube]
	I0731 10:07:04.518738    3827 provision.go:177] copyRemoteCerts
	I0731 10:07:04.518793    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:07:04.518809    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.518951    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.519038    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.519124    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.519202    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:04.553750    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:07:04.553834    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:07:04.574235    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:07:04.574311    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:07:04.594359    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:07:04.594433    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:07:04.614301    3827 provision.go:87] duration metric: took 290.6663ms to configureAuth
	I0731 10:07:04.614319    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:07:04.614509    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:04.614526    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:04.614676    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.614777    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.614880    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.614987    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.615110    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.615236    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.615386    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.615394    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:07:04.672493    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:07:04.672505    3827 buildroot.go:70] root file system type: tmpfs
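The probe that produced the `tmpfs` result above is a plain `df` invocation (GNU coreutils) and can be run on any Linux host; the buildroot guest reports `tmpfs` because its root filesystem lives in RAM, whereas a typical host will report `ext4`, `xfs`, or `overlay`:

```shell
# Detect the root filesystem type, as minikube does over SSH above.
fstype=$(df --output=fstype / | tail -n 1)
echo "root filesystem: $fstype"
```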
	I0731 10:07:04.672600    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:07:04.672612    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.672752    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.672835    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.672958    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.673042    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.673159    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.673303    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.673352    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:07:04.741034    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	Environment=NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:07:04.741052    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.741187    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.741288    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.741387    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.741494    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.741621    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.741755    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.741771    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:07:06.325916    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:07:06.325931    3827 machine.go:97] duration metric: took 13.199216588s to provisionDockerMachine
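The `diff -u ... || { mv ...; systemctl ... }` command above replaces the live docker unit only when the freshly written `docker.service.new` differs from what is already installed. A minimal sketch of that idempotent-update pattern, using temp files instead of the real `/lib/systemd/system` paths (and omitting the `daemon-reload`/`restart` a real run performs):

```shell
# Sketch of the update pattern in the log: write the candidate unit next to
# the live one, then swap it in only when the content actually differs.
# Paths here are throwaway temp files, not the real systemd locations.
unit_dir=$(mktemp -d)
printf '%s\n' '[Unit]' 'Description=old' > "$unit_dir/docker.service"
printf '%s\n' '[Unit]' 'Description=new' > "$unit_dir/docker.service.new"

# diff exits non-zero when the files differ, which triggers the replacement;
# minikube additionally runs `systemctl daemon-reload` and restarts docker.
diff -u "$unit_dir/docker.service" "$unit_dir/docker.service.new" \
  || mv "$unit_dir/docker.service.new" "$unit_dir/docker.service"

grep -q 'Description=new' "$unit_dir/docker.service" && echo "unit updated"
```

Because an unchanged unit leaves `diff` exiting zero, repeated provisioning runs skip the move (and the service restart) entirely.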
	I0731 10:07:06.325941    3827 start.go:293] postStartSetup for "ha-393000-m04" (driver="hyperkit")
	I0731 10:07:06.325948    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:07:06.325960    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.326146    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:07:06.326163    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.326257    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.326346    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.326438    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.326522    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.369998    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:07:06.375343    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:07:06.375359    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:07:06.375470    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:07:06.375663    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:07:06.375669    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:07:06.375894    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:07:06.394523    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:07:06.415884    3827 start.go:296] duration metric: took 89.928396ms for postStartSetup
	I0731 10:07:06.415906    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.416074    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:07:06.416088    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.416193    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.416287    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.416381    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.416451    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.451487    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:07:06.451545    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:07:06.482558    3827 fix.go:56] duration metric: took 13.464545279s for fixHost
	I0731 10:07:06.482584    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.482724    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.482806    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.482891    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.482992    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.483122    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:06.483263    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:06.483270    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:07:06.539713    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445626.658160546
	
	I0731 10:07:06.539725    3827 fix.go:216] guest clock: 1722445626.658160546
	I0731 10:07:06.539731    3827 fix.go:229] Guest: 2024-07-31 10:07:06.658160546 -0700 PDT Remote: 2024-07-31 10:07:06.482574 -0700 PDT m=+124.148842929 (delta=175.586546ms)
	I0731 10:07:06.539746    3827 fix.go:200] guest clock delta is within tolerance: 175.586546ms
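The mangled `date +%!s(MISSING).%!N(MISSING)` line above is the command `date +%s.%N` after passing through a Go printf-style formatter without arguments; minikube runs it over SSH and compares the guest timestamp against the host clock. A rough sketch of that tolerance check, with both timestamps taken locally and an illustrative 2-second threshold (not minikube's actual value):

```shell
# Sketch of the guest-clock check: capture two epoch timestamps and verify
# the delta stays within a tolerance. On a real run guest_ts comes back
# over SSH from the VM via `date +%s.%N`.
host_ts=$(date +%s)
guest_ts=$(date +%s)   # stand-in for the SSH round trip to the guest
delta=$((guest_ts - host_ts))

# ${delta#-} strips a leading minus so the comparison is on magnitude.
[ "${delta#-}" -le 2 ] && echo "guest clock within tolerance"
```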
	I0731 10:07:06.539751    3827 start.go:83] releasing machines lock for "ha-393000-m04", held for 13.521760862s
	I0731 10:07:06.539766    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.539895    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:06.564336    3827 out.go:177] * Found network options:
	I0731 10:07:06.583958    3827 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	W0731 10:07:06.605128    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605143    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605170    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:07:06.605183    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605593    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605717    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605786    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:07:06.605816    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	W0731 10:07:06.605831    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605845    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605864    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:07:06.605930    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:07:06.605931    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.605944    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.606068    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.606081    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.606172    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.606197    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.606270    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.606322    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.606369    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	W0731 10:07:06.638814    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:07:06.638878    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:07:06.685734    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:07:06.685752    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:07:06.685831    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:07:06.701869    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:07:06.710640    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:07:06.719391    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:07:06.719452    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:07:06.728151    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:07:06.736695    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:07:06.745525    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:07:06.754024    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:07:06.762489    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:07:06.770723    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:07:06.779179    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:07:06.787524    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:07:06.795278    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:07:06.802833    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:06.908838    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
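The run of `sed -i -r` commands above rewrites `/etc/containerd/config.toml` in place: pin the sandbox (pause) image and force `SystemdCgroup = false` so containerd uses the cgroupfs driver. The same edits can be sketched against a scratch copy of the config, using the log's own sed expressions:

```shell
# Apply the log's containerd edits to a throwaway config.toml rather than
# the real /etc/containerd/config.toml. Requires GNU sed for -i -r.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.8"
  SystemdCgroup = true
EOF

# Pin the pause image version, preserving the original indentation (\1).
sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' "$cfg"
# Switch the cgroup driver to cgroupfs by disabling SystemdCgroup.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"

grep -E 'pause:3.9|SystemdCgroup = false' "$cfg"
```

On the real host these edits only take effect after the `systemctl restart containerd` seen in the log.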
	I0731 10:07:06.929085    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:07:06.929153    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:07:06.946994    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:07:06.958792    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:07:06.977007    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:07:06.987118    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:07:06.998383    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:07:07.019497    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:07:07.030189    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:07:07.045569    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:07:07.048595    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:07:07.055870    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:07:07.070037    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:07:07.166935    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:07:07.272420    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:07:07.272447    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:07:07.286182    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:07.397807    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:07:09.678871    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.281044692s)
	I0731 10:07:09.678935    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:07:09.691390    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:07:09.706154    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:07:09.718281    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:07:09.818061    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:07:09.918372    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:10.020296    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:07:10.034132    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:07:10.045516    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:10.140924    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:07:10.198542    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:07:10.198622    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:07:10.202939    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:07:10.203007    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:07:10.206254    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:07:10.238107    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:07:10.238184    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:07:10.256129    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:07:10.301307    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:07:10.337880    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:07:10.396169    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 10:07:10.454080    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	I0731 10:07:10.491070    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:10.491478    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:07:10.496573    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
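The `/etc/hosts` command above is a remove-then-append refresh: filter out any stale `host.minikube.internal` line, append the current mapping, and copy the result back atomically. A sketch of the same pattern against a scratch hosts file (bash is assumed for the `$'\t'` tab quoting, as in the log):

```shell
# Refresh a host entry the way the log does: drop any existing line for the
# name, then append the current IP. Uses a temp file, not /etc/hosts.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.169.0.9\thost.minikube.internal\n' > "$hosts"

# grep -v removes the stale mapping (tab + name anchored at end of line);
# the printf re-adds the entry with the current address.
{ grep -v $'\thost.minikube.internal$' "$hosts"; \
  printf '192.169.0.1\thost.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"

grep 'host.minikube.internal' "$hosts"
```

The rewrite is idempotent: running it again removes the line it just added and appends an identical one.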
	I0731 10:07:10.506503    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:07:10.506687    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:10.506931    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:07:10.506954    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:07:10.515949    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52091
	I0731 10:07:10.516322    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:07:10.516656    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:07:10.516668    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:07:10.516893    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:07:10.517004    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:07:10.517099    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:07:10.517181    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:07:10.518192    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:07:10.518454    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:07:10.518477    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:07:10.527151    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52093
	I0731 10:07:10.527586    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:07:10.527914    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:07:10.527931    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:07:10.528158    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:07:10.528268    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:07:10.528367    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.8
	I0731 10:07:10.528374    3827 certs.go:194] generating shared ca certs ...
	I0731 10:07:10.528388    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:07:10.528576    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:07:10.528655    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:07:10.528666    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:07:10.528692    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:07:10.528712    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:07:10.528731    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:07:10.528834    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:07:10.528887    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:07:10.528897    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:07:10.528933    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:07:10.528968    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:07:10.529000    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:07:10.529077    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:07:10.529114    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.529135    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.529152    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.529176    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:07:10.550191    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:07:10.570588    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:07:10.590746    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:07:10.611034    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:07:10.631281    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:07:10.651472    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:07:10.671880    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:07:10.676790    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:07:10.685541    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.689430    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.689496    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.694391    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:07:10.703456    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:07:10.712113    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.715734    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.715795    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.720285    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:07:10.728964    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:07:10.737483    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.741091    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.741135    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.745570    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
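The cert steps above (seen for `15912.pem`, `minikubeCA.pem`, and `1591.pem`) wire certificates into the system trust store: OpenSSL looks up CA certs by subject hash, so each PEM in `/usr/share/ca-certificates` gets a `<hash>.0` symlink in `/etc/ssl/certs` — that is what `3ec20f2e.0`, `b5213941.0`, and `51391683.0` are. A sketch with a throwaway self-signed cert in a temp directory:

```shell
# Build the <subject-hash>.0 trust link the way the log does, using a
# freshly generated self-signed cert instead of minikube's CA files.
certdir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -subj '/CN=demoCA' \
  -keyout "$certdir/ca.key" -out "$certdir/ca.pem" 2>/dev/null

# `openssl x509 -hash -noout` prints the subject hash OpenSSL uses for
# trust-store lookups; the .0 suffix disambiguates hash collisions.
hash=$(openssl x509 -hash -noout -in "$certdir/ca.pem")
ln -fs "$certdir/ca.pem" "$certdir/$hash.0"

ls "$certdir/$hash.0"
```

`ln -fs` (as in the log's `test -L ... || ln -fs ...`) keeps the operation safe to repeat: an existing link is simply replaced.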
	I0731 10:07:10.754084    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:07:10.757225    3827 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 10:07:10.757258    3827 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.30.3 docker false true} ...
	I0731 10:07:10.757327    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:07:10.757375    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:07:10.764753    3827 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 10:07:10.764797    3827 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256
	I0731 10:07:10.772338    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 10:07:10.772344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256
	I0731 10:07:10.772398    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:07:10.772434    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 10:07:10.772437    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 10:07:10.780324    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 10:07:10.780354    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 10:07:10.780356    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 10:07:10.780369    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 10:07:10.799303    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 10:07:10.799462    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 10:07:10.842469    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 10:07:10.842511    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
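The kubeadm/kubectl/kubelet transfers above follow a check-then-copy pattern: `stat` each target binary, and only scp it from the local cache when the stat fails. A minimal sketch under stand-in temp paths (the real targets live under `/var/lib/minikube/binaries/<version>/`):

```shell
# Copy a binary only when the target is missing, mirroring the existence
# checks in the log. Temp files stand in for the cache and target paths.
bindir=$(mktemp -d)
src=$(mktemp); echo 'fake-kubelet' > "$src"   # stand-in for the cached binary

# stat fails (non-zero, message to stderr) when the target is absent,
# which is the cue to transfer; the real code uses scp over SSH here.
if ! stat -c '%s %y' "$bindir/kubelet" 2>/dev/null; then
  cp "$src" "$bindir/kubelet"
fi

stat -c %s "$bindir/kubelet"
```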
	I0731 10:07:11.478912    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0731 10:07:11.486880    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:07:11.501278    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:07:11.515550    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:07:11.518663    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:07:11.528373    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:11.625133    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:07:11.645175    3827 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0731 10:07:11.645375    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:11.651211    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:07:11.692705    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:11.797111    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:07:12.534860    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:07:12.535084    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:07:12.535128    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:07:12.535291    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m04" to be "Ready" ...
	I0731 10:07:12.535335    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:12.535339    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:12.535359    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:12.535366    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:12.537469    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:13.035600    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:13.035613    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:13.035620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:13.035622    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:13.037811    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:13.536601    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:13.536621    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:13.536630    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:13.536636    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:13.539103    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.035926    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:14.035943    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:14.035952    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:14.035957    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:14.038327    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.535691    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:14.535711    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:14.535719    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:14.535723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:14.538107    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.538174    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:15.035707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:15.035726    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:15.035735    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:15.035739    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:15.037991    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:15.535587    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:15.535602    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:15.535658    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:15.535663    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:15.537787    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.035475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:16.035497    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:16.035550    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:16.035555    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:16.037882    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.536666    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:16.536687    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:16.536712    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:16.536719    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:16.538796    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.538904    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:17.035473    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:17.035488    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:17.035495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:17.035498    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:17.037610    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:17.535997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:17.536074    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:17.536089    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:17.536096    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:17.539102    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:18.035624    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:18.035638    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:18.035646    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:18.035652    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:18.037956    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:18.535491    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:18.535589    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:18.535603    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:18.535610    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:18.538819    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:18.538965    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:19.036954    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:19.037007    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:19.037028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:19.037033    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:19.039345    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:19.536847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:19.536862    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:19.536870    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:19.536873    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:19.538820    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:20.037064    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:20.037079    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:20.037086    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:20.037089    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:20.038945    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:20.536127    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:20.536138    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:20.536145    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:20.536150    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:20.538039    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:21.036613    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:21.036684    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:21.036695    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:21.036701    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:21.039123    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:21.039186    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:21.536684    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:21.536700    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:21.536705    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:21.536708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:21.538918    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:22.036722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:22.036736    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:22.036743    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:22.036746    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:22.038627    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:22.536686    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:22.536704    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:22.536714    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:22.536718    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:22.538549    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:23.036470    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:23.036482    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:23.036489    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:23.036494    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:23.038533    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:23.535581    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:23.535639    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:23.535653    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:23.535667    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:23.539678    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:23.539740    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:24.036874    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:24.036948    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:24.036959    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:24.036965    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:24.039843    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:24.536241    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:24.536307    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:24.536318    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:24.536323    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:24.538807    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:25.036279    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:25.036343    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:25.036356    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:25.036362    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:25.038454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:25.535942    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:25.535954    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:25.535962    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:25.535967    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:25.538068    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:26.036823    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:26.036838    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:26.036845    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:26.036848    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:26.038942    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:26.039008    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:26.535480    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:26.535499    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:26.535533    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:26.535539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:26.538039    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:27.036202    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:27.036213    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:27.036219    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:27.036222    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:27.038071    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:27.537206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:27.537226    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:27.537236    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:27.537248    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:27.539573    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:28.036203    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:28.036217    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:28.036223    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:28.036225    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:28.038017    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:28.536971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:28.536988    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:28.536998    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:28.537003    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:28.539378    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:28.539442    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:29.035655    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:29.035667    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:29.035673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:29.035676    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:29.037786    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:29.537109    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:29.537124    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:29.537132    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:29.537144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:29.539430    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:30.035887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:30.035899    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:30.035905    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:30.035908    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:30.037803    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:30.535679    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:30.535701    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:30.535718    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:30.535723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:30.539029    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:31.036151    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:31.036166    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:31.036175    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:31.036179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:31.038532    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:31.038593    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:31.536698    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:31.536710    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:31.536717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:31.536720    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:31.538484    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:32.037162    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:32.037178    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:32.037185    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:32.037188    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:32.039081    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:32.536065    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:32.536085    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:32.536095    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:32.536099    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:32.538365    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:33.036492    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:33.036513    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:33.036523    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:33.036527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:33.038851    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:33.038919    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:33.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:33.535566    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:33.535572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:33.535576    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:33.537575    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:34.036894    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:34.036912    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:34.036923    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:34.036932    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:34.040173    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:34.535858    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:34.535912    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:34.535919    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:34.535922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:34.537915    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:35.036636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:35.036670    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:35.036677    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:35.036682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:35.038861    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:35.038930    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:35.535814    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:35.535827    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:35.535835    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:35.535840    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:35.538360    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:36.035769    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:36.035785    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:36.035795    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:36.035799    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:36.038202    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:36.535426    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:36.535438    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:36.535445    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:36.535449    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:36.537303    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:37.035456    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:37.035470    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:37.035479    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:37.035483    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:37.037630    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:37.536548    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:37.536562    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:37.536568    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:37.536572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:37.538659    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:37.538720    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:38.036407    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:38.036421    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:38.036427    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:38.036432    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:38.038467    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:38.537359    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:38.537378    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:38.537387    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:38.537392    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:38.539892    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:39.036414    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:39.036470    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:39.036486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:39.036495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:39.039521    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:39.535817    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:39.535832    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:39.535839    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:39.535843    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:39.537796    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:40.035880    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:40.035896    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:40.035902    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:40.035906    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:40.037712    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:40.037778    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:40.535492    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:40.535523    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:40.535536    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:40.535543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:40.538475    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:41.035745    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:41.035758    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:41.035770    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:41.035774    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:41.037656    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:41.535726    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:41.535738    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:41.535744    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:41.535747    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:41.537897    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:42.036537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:42.036554    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:42.036564    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:42.036573    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:42.039525    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:42.039600    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:42.535450    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:42.535465    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:42.535472    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:42.535475    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:42.537399    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:43.035576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:43.035592    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:43.035598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:43.035602    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:43.038048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:43.536787    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:43.536822    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:43.536832    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:43.536837    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:43.539146    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:44.036148    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:44.036161    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:44.036169    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:44.036173    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:44.038382    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:44.536653    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:44.536709    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:44.536717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:44.536720    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:44.538695    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:44.538753    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:45.036650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:45.036662    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:45.036668    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:45.036672    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:45.038555    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:45.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:45.535571    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:45.535582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:45.535590    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:45.538335    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:46.035712    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:46.035726    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:46.035735    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:46.035740    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:46.038035    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:46.535534    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:46.535549    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:46.535557    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:46.535564    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:46.537974    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:47.035871    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:47.035887    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:47.035893    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:47.035897    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:47.037864    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:47.037931    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:47.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:47.535564    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:47.535570    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:47.535573    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:47.537590    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:48.035461    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:48.035531    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:48.035539    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:48.035543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:48.037510    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:48.536520    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:48.536535    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:48.536541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:48.536544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:48.538561    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:49.035436    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:49.035448    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:49.035454    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:49.035458    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:49.037204    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:49.535574    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:49.535586    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:49.535592    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:49.535595    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:49.537443    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:49.537505    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:50.036533    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:50.036547    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:50.036562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:50.036566    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:50.038478    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:50.536624    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:50.536636    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:50.536642    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:50.536646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:50.538734    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.036016    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:51.036035    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:51.036044    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:51.036049    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:51.038643    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.536662    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:51.536677    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:51.536686    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:51.536691    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:51.539033    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.539099    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:52.036475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:52.036490    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:52.036499    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:52.036503    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:52.038975    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:52.537013    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:52.537034    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:52.537041    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:52.537045    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:52.539229    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.037093    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:53.037106    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:53.037113    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:53.037117    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:53.039169    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.536447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:53.536468    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:53.536478    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:53.536486    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:53.539425    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.539565    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:54.035597    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:54.035609    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:54.035615    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:54.035618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:54.037574    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:54.535484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:54.535503    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:54.535509    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:54.535514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:54.537529    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:55.036258    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:55.036270    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:55.036277    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:55.036280    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:55.038186    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:55.536493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:55.536513    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:55.536526    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:55.536533    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:55.539517    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:55.539589    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:56.035565    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:56.035586    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:56.035599    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:56.035605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:56.040006    3827 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0731 10:07:56.536361    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:56.536374    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:56.536380    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:56.536383    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:56.538540    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:57.036446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:57.036544    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:57.036560    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:57.036567    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:57.039754    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:57.536620    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:57.536630    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:57.536637    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:57.536639    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:57.538482    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:58.036499    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:58.036518    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:58.036527    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:58.036532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:58.039244    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:58.039325    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:58.537076    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:58.537105    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:58.537197    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:58.537204    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:58.539718    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:59.037046    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:59.037127    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:59.037142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:59.037149    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:59.040197    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:59.536758    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:59.536790    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:59.536798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:59.536802    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:59.538842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:00.035440    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:00.035453    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:00.035460    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:00.035463    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:00.037506    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:00.536873    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:00.536895    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:00.536906    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:00.536913    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:00.540041    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:00.540123    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:01.036175    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:01.036225    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:01.036239    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:01.036248    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:01.039214    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:01.535960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:01.535973    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:01.535979    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:01.535983    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:01.538089    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:02.036835    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:02.036856    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:02.036868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:02.036875    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:02.039802    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:02.536647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:02.536660    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:02.536667    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:02.536670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:02.538840    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:03.036159    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:03.036174    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:03.036181    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:03.036184    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:03.038276    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:03.038354    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:03.536974    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:03.536990    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:03.536996    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:03.537000    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:03.538828    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:04.036300    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:04.036363    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:04.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:04.036391    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:04.038707    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:04.535718    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:04.535737    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:04.535749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:04.535759    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:04.538366    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:05.036299    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:05.036316    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:05.036350    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:05.036354    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:05.038510    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:05.038568    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:05.535824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:05.535837    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:05.535843    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:05.535846    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:05.537780    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:06.036578    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:06.036592    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:06.036607    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:06.036612    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:06.038642    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:06.535656    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:06.535670    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:06.535679    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:06.535682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:06.538248    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:07.036322    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:07.036396    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:07.036407    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:07.036412    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:07.038943    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:07.039003    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:07.536357    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:07.536370    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:07.536379    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:07.536384    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:07.538778    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:08.036360    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:08.036375    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:08.036381    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:08.036384    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:08.038393    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:08.536197    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:08.536266    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:08.536278    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:08.536284    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:08.538997    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:09.036883    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:09.036911    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:09.036918    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:09.036922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:09.039071    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:09.039137    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:09.535649    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:09.535664    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:09.535673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:09.535677    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:09.537998    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:10.036205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:10.036229    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:10.036241    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:10.036247    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:10.039273    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:10.536564    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:10.536575    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:10.536582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:10.536585    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:10.538369    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:11.036693    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:11.036710    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:11.036749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:11.036753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:11.038831    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:11.535438    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:11.535452    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:11.535461    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:11.535466    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:11.537490    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:11.537597    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:12.035786    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:12.035805    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:12.035812    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:12.035816    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:12.038145    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:12.536840    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:12.536858    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:12.536868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:12.536881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:12.538815    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.037034    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:13.037049    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:13.037056    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:13.037059    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:13.038933    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.535502    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:13.535519    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:13.535593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:13.535600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:13.537560    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.537648    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:14.036280    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:14.036300    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:14.036312    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:14.036322    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:14.039000    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:14.535507    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:14.535527    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:14.535537    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:14.535543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:14.538228    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:15.036543    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:15.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:15.036634    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:15.036643    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:15.039762    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:15.535993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:15.536006    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:15.536012    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:15.536015    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:15.538186    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:15.538254    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:16.035582    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:16.035595    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:16.035602    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:16.035605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:16.037656    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:16.536650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:16.536663    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:16.536709    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:16.536713    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:16.538604    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:17.036351    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:17.036372    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:17.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:17.036393    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:17.039451    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:17.536542    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:17.536560    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:17.536573    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:17.536582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:17.539454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:17.539591    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:18.036512    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:18.036578    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:18.036588    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:18.036593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:18.038886    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:18.535537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:18.535549    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:18.535554    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:18.535559    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:18.537559    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:19.035943    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:19.035968    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:19.035980    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:19.035987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:19.038665    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:19.536893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:19.536911    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:19.536920    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:19.536925    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:19.539416    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:20.036463    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:20.036479    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:20.036495    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:20.036500    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:20.038824    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:20.038907    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:20.536286    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:20.536306    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:20.536313    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:20.536316    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:20.538429    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:21.036034    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:21.036045    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:21.036051    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:21.036055    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:21.038101    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:21.535690    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:21.535711    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:21.535732    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:21.535740    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:21.538264    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:22.036592    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:22.036604    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:22.036610    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:22.036613    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:22.038773    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:22.536090    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:22.536103    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:22.536109    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:22.536114    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:22.537988    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:22.538057    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:23.035526    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:23.035555    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:23.035562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:23.035567    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:23.037480    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:23.536652    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:23.536666    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:23.536673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:23.536677    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:23.538667    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:24.036746    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:24.036766    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:24.036778    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:24.036789    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:24.039353    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:24.536440    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:24.536452    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:24.536459    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:24.536463    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:24.538250    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:24.538315    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:25.036622    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:25.036643    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:25.036656    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:25.036666    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:25.039764    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:25.535710    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:25.535721    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:25.535737    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:25.535742    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:25.537637    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:26.036253    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:26.036276    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:26.036338    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:26.036343    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:26.038674    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:26.536815    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:26.536828    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:26.536834    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:26.536838    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:26.538867    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:26.538932    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:27.035852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:27.035864    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:27.035869    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:27.035872    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:27.038024    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:27.535997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:27.536016    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:27.536028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:27.536036    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:27.539189    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:28.035934    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:28.036002    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:28.036011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:28.036014    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:28.037996    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:28.535538    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:28.535554    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:28.535561    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:28.535563    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:28.537842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:29.037018    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:29.037032    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:29.037039    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:29.037042    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:29.038983    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:29.039043    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:29.535757    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:29.535769    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:29.535775    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:29.535778    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:29.537697    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:30.036529    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:30.036548    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:30.036557    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:30.036562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:30.038833    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:30.535560    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:30.535570    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:30.535576    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:30.535579    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:30.537657    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:31.035508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:31.035520    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:31.035527    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:31.035531    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:31.037575    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:31.536786    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:31.536800    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:31.536806    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:31.536809    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:31.538674    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:31.538731    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:32.035819    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:32.035833    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:32.035842    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:32.035848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:32.038170    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:32.535455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:32.535471    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:32.535481    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:32.535487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:32.537802    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:33.037123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:33.037156    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:33.037166    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:33.037171    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:33.039252    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:33.535741    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:33.535754    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:33.535760    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:33.535763    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:33.537979    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:34.035638    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:34.035651    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:34.035658    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:34.035661    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:34.037722    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:34.037778    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:34.535808    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:34.535823    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:34.535831    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:34.535834    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:34.538223    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:35.036584    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:35.036609    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:35.036620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:35.036625    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:35.039788    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:35.535720    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:35.535732    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:35.535738    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:35.535741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:35.537506    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:36.036439    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:36.036484    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:36.036492    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:36.036498    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:36.038534    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:36.038591    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:36.535446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:36.535458    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:36.535465    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:36.535467    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:36.537309    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:37.035737    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:37.035776    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:37.035789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:37.035794    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:37.037928    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:37.535410    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:37.535422    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:37.535430    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:37.535433    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:37.537627    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:38.036658    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:38.036738    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:38.036753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:38.036760    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:38.039378    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:38.039521    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:38.535459    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:38.535474    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:38.535490    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:38.535494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:38.537817    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:39.036931    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:39.036949    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:39.036957    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:39.036962    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:39.039286    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:39.536447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:39.536472    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:39.536487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:39.536491    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:39.538440    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:40.036354    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:40.036378    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:40.036463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:40.036469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:40.039363    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:40.535847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:40.535866    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:40.535878    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:40.535883    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:40.538740    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:40.538822    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:41.036206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:41.036221    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:41.036229    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:41.036234    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:41.038292    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:41.535741    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:41.535753    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:41.535759    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:41.535764    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:41.537837    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:42.036537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:42.036558    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:42.036566    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:42.036570    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:42.039104    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:42.536474    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:42.536484    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:42.536491    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:42.536495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:42.538339    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:43.035887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:43.035913    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:43.035925    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:43.035931    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:43.038963    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:43.039028    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:43.537036    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:43.537050    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:43.537056    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:43.537059    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:43.539282    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:44.035937    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:44.035949    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:44.035954    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:44.035958    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:44.037693    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:44.536399    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:44.536470    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:44.536481    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:44.536485    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:44.538818    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:45.036937    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:45.036952    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:45.036960    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:45.036966    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:45.039363    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:45.039449    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:45.535403    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:45.535415    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:45.535421    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:45.535424    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:45.537208    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:46.037001    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:46.037088    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:46.037104    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:46.037110    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:46.040342    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:46.536255    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:46.536269    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:46.536278    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:46.536284    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:46.538801    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:47.037251    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:47.037286    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:47.037297    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:47.037304    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:47.039048    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:47.537021    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:47.537064    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:47.537071    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:47.537076    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:47.539084    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:47.539154    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:48.037354    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:48.037369    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:48.037376    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:48.037379    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:48.039646    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:48.536219    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:48.536236    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:48.536272    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:48.536276    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:48.538242    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:49.035446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:49.035459    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:49.035465    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:49.035469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:49.037563    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:49.535517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:49.535533    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:49.535540    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:49.535543    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:49.537433    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:50.036639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:50.036659    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:50.036665    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:50.036670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:50.038735    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:50.038803    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:50.535659    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:50.535678    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:50.535690    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:50.535697    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:50.538598    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:51.036768    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:51.036782    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:51.036789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:51.036794    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:51.038898    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:51.536592    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:51.536608    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:51.536616    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:51.536621    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:51.539087    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:52.036618    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:52.036639    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:52.036652    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:52.036658    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:52.039828    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:52.039911    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:52.535902    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:52.535912    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:52.535919    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:52.535922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:52.537950    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:53.036636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:53.036705    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:53.036716    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:53.036721    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:53.039002    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:53.535455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:53.535467    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:53.535473    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:53.535476    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:53.537615    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:54.036291    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:54.036325    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:54.036406    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:54.036414    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:54.039211    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:54.535751    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:54.535763    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:54.535769    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:54.535772    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:54.537488    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:54.537606    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:55.036966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:55.036982    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:55.036988    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:55.036992    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:55.038791    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:55.537260    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:55.537303    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:55.537312    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:55.537315    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:55.539579    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:56.036346    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:56.036359    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:56.036367    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:56.036370    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:56.038527    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:56.536015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:56.536055    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:56.536063    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:56.536068    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:56.538048    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:56.538106    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:57.036625    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:57.036637    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:57.036643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:57.036646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:57.038481    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:57.536731    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:57.536744    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:57.536749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:57.536752    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:57.538619    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:58.037081    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:58.037160    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:58.037174    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:58.037182    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:58.040222    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:58.535441    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:58.535453    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:58.535460    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:58.535463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:58.537373    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:59.037130    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:59.037151    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:59.037161    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:59.037181    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:59.039237    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:59.039342    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:59.536756    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:59.536768    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:59.536774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:59.536777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:59.538430    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:00.036701    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:00.036714    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:00.036720    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:00.036723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:00.038842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:00.535558    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:00.535574    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:00.535620    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:00.535625    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:00.537993    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.036274    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:01.036293    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:01.036302    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:01.036305    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:01.038700    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.536455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:01.536488    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:01.536495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:01.536511    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:01.538672    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.538736    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:02.036272    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:02.036286    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:02.036291    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:02.036295    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:02.038419    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:02.535392    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:02.535405    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:02.535416    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:02.535419    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:02.537336    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:03.036249    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:03.036264    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:03.036271    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:03.036276    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:03.038181    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:03.536990    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:03.537012    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:03.537020    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:03.537024    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:03.541054    3827 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0731 10:09:03.541125    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:04.036809    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:04.036887    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:04.036896    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:04.036902    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:04.039202    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:04.537089    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:04.537152    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:04.537166    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:04.537904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:04.540615    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:05.036817    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:05.036832    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:05.036838    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:05.036842    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:05.038865    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:05.535412    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:05.535430    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:05.535438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:05.535446    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:05.538103    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:06.036140    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:06.036160    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:06.036172    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:06.036179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:06.039025    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:06.039098    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:06.536908    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:06.536923    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:06.536930    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:06.536933    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:06.538854    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:07.035951    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:07.035965    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:07.035974    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:07.035979    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:07.038105    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:07.535618    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:07.535629    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:07.535635    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:07.535637    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:07.537552    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:08.036184    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:08.036212    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:08.036273    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:08.036279    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:08.038850    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:08.536040    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:08.536056    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:08.536065    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:08.536069    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:08.538402    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:08.538460    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:09.036971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:09.037018    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:09.037025    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:09.037031    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:09.039100    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:09.535468    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:09.535480    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:09.535487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:09.535490    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:09.537589    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.035464    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:10.035479    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:10.035491    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:10.035506    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:10.037831    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.536550    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:10.536622    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:10.536632    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:10.536638    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:10.539005    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.539064    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:11.037316    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:11.037399    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:11.037415    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:11.037425    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:11.040113    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:11.536965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:11.536989    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:11.537033    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:11.537044    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:11.539689    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:12.036399    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:12.036469    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:12.036480    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:12.036486    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:12.038399    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:12.535441    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:12.535463    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:12.535475    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:12.535486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:12.539207    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:12.539333    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:13.036110    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:13.036220    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:13.036231    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:13.036236    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:13.038510    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:13.535970    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:13.535990    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:13.536002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:13.536008    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:13.539197    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:14.037193    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:14.037263    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:14.037274    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:14.037286    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:14.039603    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:14.535571    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:14.535586    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:14.535591    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:14.535594    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:14.537915    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:15.036611    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:15.036630    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:15.036642    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:15.036648    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:15.039592    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:15.039739    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:15.535565    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:15.535590    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:15.535602    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:15.535608    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:15.539127    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:16.035884    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:16.035904    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:16.035915    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:16.035919    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:16.038938    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:16.535882    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:16.535893    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:16.535900    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:16.535904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:16.537836    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:17.036590    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:17.036605    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:17.036613    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:17.036618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:17.039082    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:17.535436    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:17.535454    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:17.535466    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:17.535472    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:17.539228    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:17.539295    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:18.035478    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:18.035491    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:18.035505    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:18.035509    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:18.037946    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:18.536869    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:18.536884    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:18.536890    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:18.536896    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:18.538941    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:19.035847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:19.035859    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:19.035865    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:19.035868    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:19.037761    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:19.536117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:19.536142    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:19.536154    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:19.536160    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:19.539347    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:19.539466    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:20.036919    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:20.036993    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:20.037004    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:20.037009    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:20.039230    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:20.536619    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:20.536716    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:20.536731    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:20.536738    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:20.539591    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:21.036024    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:21.036114    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:21.036129    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:21.036136    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:21.038666    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:21.535434    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:21.535447    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:21.535453    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:21.535457    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:21.537251    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:22.037204    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:22.037219    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:22.037228    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:22.037234    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:22.039524    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:22.039581    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:22.536431    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:22.536450    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:22.536464    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:22.536473    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:22.539233    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:23.035562    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:23.035606    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:23.035627    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:23.035634    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:23.037971    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:23.536650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:23.536675    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:23.536742    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:23.536752    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:23.539879    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:24.035514    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:24.035529    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:24.035535    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:24.035544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:24.037431    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:24.536058    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:24.536156    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:24.536171    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:24.536179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:24.538730    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:24.538810    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:25.036752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:25.036804    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:25.036814    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:25.036821    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:25.039117    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:25.535569    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:25.535587    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:25.535596    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:25.535600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:25.538114    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:26.035517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:26.035542    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:26.035556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:26.035562    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:26.038485    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:26.536365    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:26.536379    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:26.536386    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:26.536390    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:26.538690    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:27.036639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:27.036652    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:27.036703    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:27.036709    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:27.038432    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:27.038498    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:27.535539    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:27.535560    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:27.535572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:27.535580    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:27.538434    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:28.035626    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:28.035638    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:28.035644    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:28.035647    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:28.037699    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:28.536177    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:28.536199    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:28.536212    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:28.536217    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:28.539218    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:29.036925    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:29.036950    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:29.036962    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:29.036969    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:29.040007    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:29.040064    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:29.537194    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:29.537209    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:29.537228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:29.537240    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:29.539598    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:30.036373    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:30.036471    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:30.036486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:30.036494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:30.039302    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:30.536789    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:30.536807    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:30.536815    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:30.536820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:30.539885    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:31.036599    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:31.036624    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:31.036635    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:31.036643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:31.039815    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:31.536237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:31.536285    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:31.536295    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:31.536301    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:31.538680    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:31.538744    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:32.036451    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:32.036463    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:32.036469    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:32.036472    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:32.038847    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:32.536969    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:32.537019    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:32.537032    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:32.537041    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:32.539636    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:33.035557    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:33.035573    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:33.035582    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:33.035587    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:33.038048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:33.535485    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:33.535509    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:33.535522    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:33.535529    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:33.538268    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:34.035811    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:34.035830    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:34.035841    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:34.035846    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:34.038580    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:34.038645    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:34.535515    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:34.535533    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:34.535543    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:34.535562    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:34.537523    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:35.036865    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:35.036880    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:35.036887    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:35.036890    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:35.038894    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:35.535476    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:35.535566    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:35.535574    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:35.535579    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:35.537495    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:36.036205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:36.036221    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:36.036227    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:36.036231    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:36.038994    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:36.039061    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:36.536105    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:36.536117    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:36.536124    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:36.536127    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:36.538020    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:37.036112    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:37.036124    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:37.036130    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:37.036134    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:37.037953    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:37.536082    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:37.536101    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:37.536110    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:37.536114    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:37.538459    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:38.035493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:38.035509    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:38.035517    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:38.035524    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:38.037791    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:38.535613    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:38.535632    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:38.535645    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:38.535668    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:38.539185    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:38.539281    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:39.036660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:39.036682    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:39.036693    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:39.036700    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:39.039452    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:39.535986    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:39.536000    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:39.536007    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:39.536011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:39.537968    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:40.036939    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:40.037010    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:40.037021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:40.037026    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:40.039435    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:40.536149    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:40.536171    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:40.536233    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:40.536239    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:40.538338    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:41.036629    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:41.036641    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:41.036647    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:41.036651    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:41.038835    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:41.038897    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:41.536269    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:41.536280    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:41.536287    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:41.536290    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:41.538277    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:42.036495    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:42.036511    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:42.036520    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:42.036524    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:42.038560    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:42.537182    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:42.537201    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:42.537210    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:42.537215    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:42.539833    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.035857    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:43.035874    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:43.035881    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:43.035891    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:43.038530    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.536377    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:43.536465    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:43.536480    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:43.536488    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:43.539159    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.539217    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:44.036979    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:44.037065    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:44.037081    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:44.037089    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:44.039312    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:44.536993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:44.537011    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:44.537018    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:44.537063    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:44.539131    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.036929    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:45.036952    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:45.037050    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:45.037064    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:45.039700    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.537089    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:45.537112    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:45.537123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:45.537132    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:45.539940    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.540011    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:46.036811    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:46.036857    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:46.036868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:46.036882    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:46.039540    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:46.535831    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:46.535845    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:46.535852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:46.535856    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:46.538387    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:47.036117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:47.036128    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:47.036134    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:47.036137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:47.037871    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:47.536504    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:47.536553    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:47.536564    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:47.536568    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:47.538867    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:48.036960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:48.036980    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:48.036992    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:48.036998    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:48.040512    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:48.041066    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:48.535514    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:48.535532    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:48.535542    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:48.535547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:48.537881    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:49.036112    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:49.036124    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:49.036130    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:49.036133    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:49.038899    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:49.536876    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:49.536893    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:49.536899    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:49.536904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:49.538675    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:50.037190    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:50.037204    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:50.037213    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:50.037216    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:50.039015    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:50.536824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:50.536920    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:50.536935    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:50.536942    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:50.539735    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:50.539808    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:51.035683    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:51.035696    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:51.035702    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:51.035706    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:51.038883    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:51.536861    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:51.536882    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:51.536894    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:51.536901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:51.539779    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:52.035474    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:52.035485    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:52.035493    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:52.035499    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:52.037401    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:52.536642    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:52.536661    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:52.536669    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:52.536674    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:52.538949    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:53.036427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:53.036471    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:53.036482    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:53.036487    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:53.038951    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:53.039010    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:53.535427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:53.535439    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:53.535446    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:53.535450    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:53.537257    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:54.036806    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:54.036821    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:54.036828    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:54.036832    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:54.039021    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:54.535805    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:54.535897    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:54.535912    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:54.535919    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:54.538990    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:55.036521    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:55.036539    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:55.036546    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:55.036549    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:55.038766    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:55.536647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:55.536714    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:55.536723    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:55.536727    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:55.539055    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:55.539163    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:56.035522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:56.035534    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:56.035541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:56.035545    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:56.038160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:56.535916    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:56.535934    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:56.535943    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:56.535949    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:56.538329    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:57.036391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:57.036406    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:57.036413    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:57.036417    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:57.038267    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:57.535390    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:57.535439    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:57.535447    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:57.535452    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:57.537243    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:58.036752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:58.036778    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:58.036805    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:58.036809    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:58.038620    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:58.038682    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:58.536471    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:58.536516    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:58.536526    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:58.536532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:58.538643    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:59.035837    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:59.035851    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:59.035858    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:59.035861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:59.037705    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:59.536730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:59.536832    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:59.536848    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:59.536854    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:59.539682    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:00.035558    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:00.035587    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:00.035600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:00.035612    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:00.037523    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:00.535512    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:00.535528    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:00.535534    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:00.535537    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:00.537603    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:00.537667    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:01.036888    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:01.036943    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:01.036951    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:01.036955    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:01.038774    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:01.535488    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:01.535504    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:01.535513    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:01.535517    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:01.538017    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:02.036031    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:02.036045    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:02.036051    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:02.036054    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:02.037488    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:02.537218    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:02.537285    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:02.537295    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:02.537300    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:02.539559    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:02.539701    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:03.036241    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:03.036256    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:03.036263    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:03.036269    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:03.037763    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:03.536877    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:03.536892    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:03.536901    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:03.536904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:03.539168    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:04.035721    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:04.035733    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:04.035739    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:04.035742    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:04.037607    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:04.535679    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:04.535694    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:04.535703    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:04.535707    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:04.537920    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:05.037180    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:05.037195    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:05.037201    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:05.037205    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:05.038872    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:05.038947    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:05.536233    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:05.536248    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:05.536254    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:05.536258    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:05.538191    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:06.036830    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:06.036845    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:06.036852    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:06.036856    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:06.038427    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:06.536722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:06.536735    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:06.536741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:06.536753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:06.538631    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:07.036171    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:07.036186    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:07.036192    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:07.036195    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:07.038330    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:07.536466    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:07.536481    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:07.536488    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:07.536492    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:07.538446    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:07.538510    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:08.036787    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:08.036821    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:08.036832    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:08.036853    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:08.039084    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:08.535567    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:08.535582    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:08.535589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:08.535593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:08.537711    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.035421    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:09.035432    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:09.035438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:09.035442    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:09.037921    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.535887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:09.535904    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:09.535913    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:09.535943    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:09.538516    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.538592    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:10.035458    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:10.035469    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:10.035474    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:10.035477    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:10.038652    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:10.535979    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:10.535992    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:10.535998    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:10.536002    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:10.537981    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:11.035819    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:11.035886    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:11.035897    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:11.035901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:11.038043    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:11.535475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:11.535487    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:11.535494    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:11.535497    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:11.537395    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:12.036578    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:12.036591    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:12.036598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:12.036601    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:12.038621    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:12.038676    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:12.536927    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:12.536941    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:12.536947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:12.536952    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:12.539050    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:13.036386    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:13.036399    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:13.036428    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:13.036433    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:13.038022    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:13.536356    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:13.536376    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:13.536403    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:13.536406    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:13.538305    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:14.035960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:14.035973    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:14.035979    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:14.035983    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:14.037566    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:14.535889    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:14.535909    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:14.535920    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:14.535926    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:14.538796    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:14.538873    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:15.037263    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:15.037278    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:15.037284    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:15.037291    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:15.038934    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:15.535930    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:15.535949    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:15.535957    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:15.535961    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:15.538412    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:16.035774    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:16.035790    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:16.035798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:16.035803    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:16.037617    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:16.536338    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:16.536352    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:16.536359    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:16.536362    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:16.538545    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:17.036602    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:17.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:17.036625    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:17.036630    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:17.039042    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:17.039098    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:17.535886    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:17.535901    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:17.535907    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:17.535910    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:17.538060    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:18.036894    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:18.036938    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:18.036947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:18.036950    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:18.038702    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:18.535556    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:18.535571    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:18.535580    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:18.535586    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:18.537620    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:19.035993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:19.036009    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:19.036017    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:19.036021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:19.038160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:19.536410    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:19.536433    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:19.536444    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:19.536452    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:19.539613    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:19.539694    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:20.035430    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:20.035445    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:20.035456    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:20.035466    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:20.037008    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:20.536812    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:20.536836    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:20.536849    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:20.536855    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:20.539846    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:21.035730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:21.035746    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:21.035755    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:21.035761    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:21.037893    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:21.536119    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:21.536158    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:21.536173    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:21.536181    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:21.538305    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:22.035742    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:22.035779    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:22.035790    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:22.035796    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:22.038072    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:22.038175    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:22.536977    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:22.536992    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:22.536999    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:22.537002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:22.539319    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:23.036522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:23.036538    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:23.036544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:23.036547    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:23.038326    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:23.537176    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:23.537194    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:23.537202    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:23.537208    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:23.539537    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:24.036672    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:24.036686    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:24.036692    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:24.036696    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:24.038290    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:24.038347    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:24.536490    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:24.536508    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:24.536519    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:24.536525    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:24.539462    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:25.036309    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:25.036323    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:25.036329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:25.036332    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:25.038173    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:25.535523    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:25.535539    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:25.535547    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:25.535552    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:25.538454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:26.035663    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:26.035681    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:26.035719    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:26.035722    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:26.037593    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:26.536821    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:26.536884    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:26.536893    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:26.536896    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:26.538841    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:26.538912    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:27.036722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:27.036734    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:27.036740    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:27.036743    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:27.038648    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:27.537059    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:27.537079    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:27.537111    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:27.537116    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:27.539595    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:28.035398    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:28.035411    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:28.035417    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:28.035421    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:28.037116    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:28.536047    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:28.536115    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:28.536125    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:28.536133    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:28.538589    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:29.036033    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:29.036048    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:29.036055    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:29.036058    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:29.038794    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:29.038860    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:29.536173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:29.536187    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:29.536193    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:29.536198    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:29.538161    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:30.036950    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:30.037050    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:30.037065    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:30.037072    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:30.039996    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:30.536407    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:30.536424    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:30.536485    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:30.536492    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:30.538637    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:31.036484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:31.036581    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:31.036593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:31.036600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:31.039439    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:31.039521    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:31.535848    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:31.535863    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:31.535872    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:31.535878    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:31.538048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:32.036070    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:32.036083    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:32.036092    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:32.036097    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:32.038358    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:32.535559    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:32.535583    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:32.535597    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:32.535604    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:32.538962    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:33.035868    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:33.035880    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:33.035887    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:33.035890    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:33.038234    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:33.536345    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:33.536363    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:33.536408    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:33.536413    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:33.538408    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:33.538470    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:34.035876    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:34.035899    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:34.035911    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:34.035917    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:34.038813    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:34.535532    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:34.535555    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:34.535599    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:34.535611    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:34.538619    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.036525    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:35.036545    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:35.036557    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:35.036565    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:35.039453    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.536317    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:35.536338    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:35.536346    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:35.536351    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:35.538546    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.538604    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:36.035614    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:36.035632    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:36.035642    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:36.035648    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:36.037951    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:36.535593    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:36.535610    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:36.535620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:36.535627    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:36.538091    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:37.035952    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:37.035972    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:37.035984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:37.035992    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:37.039078    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:37.536397    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:37.536416    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:37.536425    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:37.536431    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:37.538652    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:37.538721    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:38.036647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:38.036688    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:38.036697    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:38.036702    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:38.038657    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:38.535391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:38.535458    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:38.535469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:38.535474    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:38.537747    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:39.036877    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:39.036896    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:39.036908    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:39.036916    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:39.039937    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:39.537361    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:39.537463    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:39.537475    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:39.537480    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:39.540492    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:39.540575    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:40.035736    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:40.035759    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:40.035797    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:40.035817    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:40.038896    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:40.536124    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:40.536136    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:40.536142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:40.536147    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:40.538082    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:41.036456    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:41.036502    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:41.036513    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:41.036519    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:41.038631    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:41.535516    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:41.535529    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:41.535535    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:41.535539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:41.537637    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:42.035758    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:42.035779    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:42.035790    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:42.035795    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:42.038565    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:42.038648    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:42.536775    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:42.536801    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:42.536856    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:42.536867    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:42.539883    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:43.036733    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:43.036747    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:43.036754    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:43.036758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:43.038792    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:43.536704    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:43.536719    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:43.536725    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:43.536730    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:43.538830    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:44.037317    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:44.037342    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:44.037351    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:44.037356    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:44.040355    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:44.040430    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:44.537337    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:44.537352    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:44.537358    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:44.537362    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:44.539426    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:45.036153    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:45.036174    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:45.036187    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:45.036193    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:45.039178    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:45.535572    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:45.535584    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:45.535591    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:45.535596    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:45.537420    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:46.037146    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:46.037161    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:46.037168    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:46.037199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:46.039539    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:46.536761    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:46.536842    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:46.536857    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:46.536863    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:46.539600    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:46.539683    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:47.037209    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:47.037228    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:47.037237    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:47.037243    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:47.039381    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:47.536097    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:47.536127    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:47.536138    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:47.536143    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:47.540045    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:48.035580    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:48.035598    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:48.035610    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:48.035618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:48.037609    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:48.535945    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:48.535960    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:48.535966    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:48.535969    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:48.537852    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:49.036904    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:49.036928    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:49.036941    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:49.036946    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:49.039794    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:49.039868    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:49.536635    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:49.536649    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:49.536699    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:49.536704    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:49.538637    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:50.035478    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:50.035491    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:50.035497    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:50.035500    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:50.037398    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:50.536222    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:50.536321    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:50.536335    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:50.536342    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:50.539228    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.035730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:51.035748    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:51.035813    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:51.035820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:51.037953    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.536457    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:51.536471    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:51.536480    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:51.536485    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:51.538865    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.538935    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:52.036481    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:52.036503    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:52.036583    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:52.036593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:52.039545    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:52.536583    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:52.536620    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:52.536636    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:52.536646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:52.539115    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:53.037214    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:53.037226    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:53.037256    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:53.037262    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:53.039257    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:53.535880    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:53.535892    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:53.535898    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:53.535901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:53.538097    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:54.035680    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:54.035691    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:54.035697    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:54.035702    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:54.037758    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:54.037819    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:54.536181    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:54.536195    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:54.536250    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:54.536256    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:54.538069    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:55.036750    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:55.036858    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:55.036874    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:55.036881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:55.040140    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:55.535731    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:55.535746    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:55.535752    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:55.535755    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:55.537710    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:56.037367    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:56.037382    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:56.037392    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:56.037396    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:56.039716    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:56.039828    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:56.535738    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:56.535750    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:56.535757    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:56.535760    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:56.537553    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:57.036797    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:57.036852    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:57.036859    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:57.036862    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:57.038921    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:57.535419    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:57.535437    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:57.535446    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:57.535452    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:57.537842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.035459    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:58.035475    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:58.035484    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:58.035488    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:58.037963    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.536607    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:58.536625    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:58.536640    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:58.536653    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:58.539173    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.539233    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:59.035868    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:59.035890    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:59.035902    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:59.035912    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:59.038872    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:59.535411    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:59.535426    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:59.535432    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:59.535434    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:59.537913    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:00.036663    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:00.036679    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:00.036686    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:00.036690    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:00.038915    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:00.536586    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:00.536602    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:00.536610    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:00.536615    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:00.538823    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:01.037017    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:01.037041    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:01.037053    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:01.037058    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:01.039885    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:01.039956    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:01.537010    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:01.537022    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:01.537028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:01.537032    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:01.538870    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:02.036801    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:02.036819    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:02.036827    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:02.036831    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:02.039277    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:02.535479    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:02.535495    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:02.535501    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:02.535505    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:02.537168    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:03.037023    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:03.037069    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:03.037079    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:03.037084    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:03.039521    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:03.536060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:03.536073    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:03.536079    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:03.536083    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:03.538949    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:03.539021    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:04.036364    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:04.036379    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:04.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:04.036390    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:04.038419    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:04.536237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:04.536251    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:04.536260    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:04.536264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:04.538409    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:05.035688    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:05.035701    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:05.035708    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:05.035712    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:05.037474    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:05.535639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:05.535661    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:05.535671    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:05.535676    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:05.538235    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:06.036540    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:06.036554    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:06.036560    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:06.036564    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:06.039139    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:06.039201    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:06.536852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:06.536867    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:06.536875    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:06.536879    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:06.539160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:07.037400    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:07.037412    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:07.037419    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:07.037422    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:07.039316    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:07.535475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:07.535496    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:07.535507    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:07.535514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:07.538665    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:11:08.035588    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:08.035602    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:08.035609    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:08.035614    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:08.037450    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:08.535606    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:08.535617    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:08.535624    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:08.535628    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:08.537643    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:08.537700    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:09.036533    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:09.036549    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:09.036556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:09.036560    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:09.038511    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:09.536726    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:09.536794    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:09.536805    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:09.536810    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:09.539347    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.036599    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:10.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:10.036626    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:10.036630    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:10.038891    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.535919    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:10.535991    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:10.536003    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:10.536009    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:10.538198    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.538256    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:11.035775    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:11.035789    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:11.035795    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:11.035799    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:11.037602    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:11.535963    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:11.535977    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:11.535984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:11.535988    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:11.538020    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:12.035422    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:12.035494    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:12.035509    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:12.035514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:12.037902    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:12.536484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:12.536500    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:12.536506    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:12.536510    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:12.538333    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:12.538392    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:12.538407    3827 node_ready.go:38] duration metric: took 4m0.003142979s for node "ha-393000-m04" to be "Ready" ...
	I0731 10:11:12.560167    3827 out.go:177] 
	W0731 10:11:12.580908    3827 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0731 10:11:12.580926    3827 out.go:239] * 
	W0731 10:11:12.582125    3827 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:11:12.680641    3827 out.go:177] 
	
	
	==> Docker <==
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.914226423Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928630776Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928700349Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928854780Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.929029367Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.930900389Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.930985608Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.931085246Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.931220258Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928429805Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.933866106Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.933878374Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.934267390Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953115079Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953269656Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953688559Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953968281Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:38 ha-393000 dockerd[1174]: time="2024-07-31T17:06:38.259320248Z" level=info msg="ignoring event" container=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259503796Z" level=info msg="shim disconnected" id=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 namespace=moby
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259817357Z" level=warning msg="cleaning up after shim disconnected" id=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 namespace=moby
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259827803Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937784723Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937892479Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937935988Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.938076078Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	21ff27483d07f       6e38f40d628db                                                                                         4 minutes ago       Running             storage-provisioner       2                   31c959cec2158       storage-provisioner
	7500c837dfe73       8c811b4aec35f                                                                                         5 minutes ago       Running             busybox                   1                   f5579bdb56284       busybox-fc5497c4f-b94zr
	492e11c732d18       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   1                   22a2f7cb99560       coredns-7db6d8ff4d-wvqjl
	26d835568c733       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   1                   8336e3fbaa274       coredns-7db6d8ff4d-5m8st
	193af4895baf9       6f1d07c71fa0f                                                                                         5 minutes ago       Running             kindnet-cni               1                   304fa6a12c82b       kindnet-hjm7c
	4f56054bbee16       55bb025d2cfa5                                                                                         5 minutes ago       Running             kube-proxy                1                   7e638ed37b5ca       kube-proxy-zc52f
	c2de84de71d0d       6e38f40d628db                                                                                         5 minutes ago       Exited              storage-provisioner       1                   31c959cec2158       storage-provisioner
	42b34888f43b4       76932a3b37d7e                                                                                         5 minutes ago       Running             kube-controller-manager   6                   dd7a38b9a9134       kube-controller-manager-ha-393000
	bf0af6a864492       38af8ddebf499                                                                                         5 minutes ago       Running             kube-vip                  1                   7ae512ce66d9e       kube-vip-ha-393000
	0a6a6d756b8d8       76932a3b37d7e                                                                                         5 minutes ago       Exited              kube-controller-manager   5                   dd7a38b9a9134       kube-controller-manager-ha-393000
	a34d35a3b612b       3edc18e7b7672                                                                                         5 minutes ago       Running             kube-scheduler            2                   b550834f339ce       kube-scheduler-ha-393000
	488f4fddc126e       3861cfcd7c04c                                                                                         5 minutes ago       Running             etcd                      2                   35bc88d55a5f9       etcd-ha-393000
	7e0d32286913b       1f6d574d502f3                                                                                         5 minutes ago       Running             kube-apiserver            5                   913ebe1d27d36       kube-apiserver-ha-393000
	aec44315311a1       1f6d574d502f3                                                                                         7 minutes ago       Exited              kube-apiserver            4                   194073f1c5ac9       kube-apiserver-ha-393000
	86018b08bbaa1       3861cfcd7c04c                                                                                         10 minutes ago      Exited              etcd                      1                   ba75e4f4299bf       etcd-ha-393000
	5fcb6f7d8ab78       38af8ddebf499                                                                                         10 minutes ago      Exited              kube-vip                  0                   e6198932cc027       kube-vip-ha-393000
	d088fefe5f8e3       3edc18e7b7672                                                                                         10 minutes ago      Exited              kube-scheduler            1                   f04a7ecd568d2       kube-scheduler-ha-393000
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   14 minutes ago      Exited              busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         16 minutes ago      Exited              coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         16 minutes ago      Exited              coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              17 minutes ago      Exited              kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         17 minutes ago      Exited              kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	
	
	==> coredns [26d835568c73] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45868 - 37816 "HINFO IN 2903702352377705943.3393804209116430399. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.009308312s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[336879232]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.192) (total time: 30001ms):
	Trace[336879232]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.193)
	Trace[336879232]: [30.001669762s] [30.001669762s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[792684680]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.192) (total time: 30002ms):
	Trace[792684680]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.193)
	Trace[792684680]: [30.002844954s] [30.002844954s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[252017809]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.190) (total time: 30004ms):
	Trace[252017809]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.192)
	Trace[252017809]: [30.004125023s] [30.004125023s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [492e11c732d1] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:50203 - 38178 "HINFO IN 6515882504773672893.3508195612419770899. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.008964582s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1731745039]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.202) (total time: 30000ms):
	Trace[1731745039]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.203)
	Trace[1731745039]: [30.000463s] [30.000463s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1820975691]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.203) (total time: 30000ms):
	Trace[1820975691]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.203)
	Trace[1820975691]: [30.00019609s] [30.00019609s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[58591392]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.202) (total time: 30001ms):
	Trace[58591392]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.203)
	Trace[58591392]: [30.001286385s] [30.001286385s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [feda36fb8a03] <==
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	Name:               ha-393000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:53:48 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:11:07 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 17:06:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-393000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 9e10f5eb61854acbaf6547934383ee12
	  System UUID:                2cfe48dd-0000-0000-9b98-537ad9823a95
	  Boot ID:                    b9343713-c701-4963-b11c-cdefca0b39ab
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-b94zr              0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 coredns-7db6d8ff4d-5m8st             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     17m
	  kube-system                 coredns-7db6d8ff4d-wvqjl             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     17m
	  kube-system                 etcd-ha-393000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         17m
	  kube-system                 kindnet-hjm7c                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      17m
	  kube-system                 kube-apiserver-ha-393000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-controller-manager-ha-393000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-proxy-zc52f                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-scheduler-ha-393000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-vip-ha-393000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m9s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         17m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 17m                    kube-proxy       
	  Normal  Starting                 5m6s                   kube-proxy       
	  Normal  Starting                 17m                    kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  17m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  17m                    kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    17m                    kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     17m                    kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           17m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  NodeReady                16m                    kubelet          Node ha-393000 status is now: NodeReady
	  Normal  RegisteredNode           15m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           14m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           12m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  NodeHasSufficientMemory  5m53s (x8 over 5m53s)  kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  Starting                 5m53s                  kubelet          Starting kubelet.
	  Normal  NodeHasNoDiskPressure    5m53s (x8 over 5m53s)  kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     5m53s (x7 over 5m53s)  kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  5m53s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           5m11s                  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           5m3s                   node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           4m34s                  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	
	
	Name:               ha-393000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:55:06 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:11:05 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:27 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-393000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 83c6a2bd65fe41eb8d2ed449f1d84121
	  System UUID:                7863443c-0000-0000-8e8d-bbd47bc06547
	  Boot ID:                    aad47d4e-f7f0-4bd8-87b6-edfb69496407
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-zln22                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 etcd-ha-393000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         16m
	  kube-system                 kindnet-lcwbs                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      16m
	  kube-system                 kube-apiserver-ha-393000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-controller-manager-ha-393000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-proxy-cf577                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-scheduler-ha-393000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-vip-ha-393000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 5m23s                  kube-proxy       
	  Normal   Starting                 12m                    kube-proxy       
	  Normal   Starting                 16m                    kube-proxy       
	  Normal   NodeAllocatableEnforced  16m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  16m (x8 over 16m)      kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    16m (x8 over 16m)      kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     16m (x7 over 16m)      kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           16m                    node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           15m                    node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           14m                    node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   Starting                 12m                    kubelet          Starting kubelet.
	  Warning  Rebooted                 12m                    kubelet          Node ha-393000-m02 has been rebooted, boot id: febe9487-cc37-4f76-a943-4c3bd5898a28
	  Normal   NodeHasSufficientPID     12m                    kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  12m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  12m                    kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    12m                    kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   RegisteredNode           12m                    node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   Starting                 5m34s                  kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  5m34s (x8 over 5m34s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    5m34s (x8 over 5m34s)  kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     5m34s (x7 over 5m34s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  5m34s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           5m11s                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           5m3s                   node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           4m34s                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	
	
	Name:               ha-393000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:56:18 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:11:11 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:06:25 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:06:25 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:06:25 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:06:25 +0000   Wed, 31 Jul 2024 16:56:40 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-393000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 6bd67d455470412d948a97ba6f8b8a9a
	  System UUID:                451d42a6-0000-0000-8ccb-b8851dda0594
	  Boot ID:                    0d534f8f-f62b-4786-808f-39cb1c1bf961
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-n8d7h                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 etcd-ha-393000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         14m
	  kube-system                 kindnet-s2pv6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      14m
	  kube-system                 kube-apiserver-ha-393000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-controller-manager-ha-393000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-proxy-cr9pg                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-scheduler-ha-393000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-vip-ha-393000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 4m46s              kube-proxy       
	  Normal   Starting                 14m                kube-proxy       
	  Normal   NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  14m (x8 over 14m)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    14m (x8 over 14m)  kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     14m (x7 over 14m)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           14m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           14m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           14m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           12m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           5m11s              node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           5m3s               node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   Starting                 4m50s              kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  4m50s              kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  4m50s              kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    4m50s              kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     4m50s              kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 4m50s              kubelet          Node ha-393000-m03 has been rebooted, boot id: 0d534f8f-f62b-4786-808f-39cb1c1bf961
	  Normal   RegisteredNode           4m34s              node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	
	
	==> dmesg <==
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.035849] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.008140] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[  +5.683009] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.007123] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.689234] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.257015] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000008] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +2.569890] systemd-fstab-generator[473]: Ignoring "noauto" option for root device
	[  +0.101117] systemd-fstab-generator[485]: Ignoring "noauto" option for root device
	[  +1.260537] kauditd_printk_skb: 42 callbacks suppressed
	[  +0.721842] systemd-fstab-generator[1103]: Ignoring "noauto" option for root device
	[  +0.244917] systemd-fstab-generator[1140]: Ignoring "noauto" option for root device
	[  +0.105223] systemd-fstab-generator[1152]: Ignoring "noauto" option for root device
	[  +0.108861] systemd-fstab-generator[1166]: Ignoring "noauto" option for root device
	[  +2.483787] systemd-fstab-generator[1382]: Ignoring "noauto" option for root device
	[  +0.096628] systemd-fstab-generator[1394]: Ignoring "noauto" option for root device
	[  +0.110449] systemd-fstab-generator[1406]: Ignoring "noauto" option for root device
	[  +0.128159] systemd-fstab-generator[1422]: Ignoring "noauto" option for root device
	[  +0.446597] systemd-fstab-generator[1585]: Ignoring "noauto" option for root device
	[  +6.854766] kauditd_printk_skb: 271 callbacks suppressed
	[ +21.847998] kauditd_printk_skb: 40 callbacks suppressed
	[Jul31 17:06] kauditd_printk_skb: 80 callbacks suppressed
	
	
	==> etcd [488f4fddc126] <==
	{"level":"warn","ts":"2024-07-31T17:06:16.678285Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:16.778473Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:16.87759Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:16.978537Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.078586Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.089155Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.090449Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.157185Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.16138Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.177991Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:19.235777Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.7:2380/version","remote-member-id":"cc1c22e219d8e152","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:19.235865Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"cc1c22e219d8e152","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:20.081142Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:20.096152Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:23.237137Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.7:2380/version","remote-member-id":"cc1c22e219d8e152","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:23.237345Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"cc1c22e219d8e152","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:25.082226Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:25.096276Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"info","ts":"2024-07-31T17:06:27.159074Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:06:27.159122Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:06:27.167117Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:06:27.326929Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"cc1c22e219d8e152","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-07-31T17:06:27.327046Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:06:27.348194Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"cc1c22e219d8e152","stream-type":"stream Message"}
	{"level":"info","ts":"2024-07-31T17:06:27.348297Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	
	
	==> etcd [86018b08bbaa] <==
	{"level":"info","ts":"2024-07-31T17:04:54.706821Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:54.70684Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:54.70685Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:04:55.363421Z","caller":"etcdserver/server.go:2089","msg":"failed to publish local member to cluster through raft","local-member-id":"b8c6c7563d17d844","local-member-attributes":"{Name:ha-393000 ClientURLs:[https://192.169.0.5:2379]}","request-path":"/0/members/b8c6c7563d17d844/attributes","publish-timeout":"7s","error":"etcdserver: request timed out"}
	{"level":"warn","ts":"2024-07-31T17:04:55.539618Z","caller":"etcdhttp/health.go:232","msg":"serving /health false; no leader"}
	{"level":"warn","ts":"2024-07-31T17:04:55.539664Z","caller":"etcdhttp/health.go:119","msg":"/health error","output":"{\"health\":\"false\",\"reason\":\"RAFT NO LEADER\"}","status-code":503}
	{"level":"info","ts":"2024-07-31T17:04:56.510556Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.510829Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.511027Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.51112Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.511212Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306509Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306743Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306923Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.307075Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.307212Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404702Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404767Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404769Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:04:59.405991Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"info","ts":"2024-07-31T17:05:00.106932Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106958Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106967Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106977Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106982Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	
	
	==> kernel <==
	 17:11:15 up 6 min,  0 users,  load average: 0.35, 0.28, 0.12
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [193af4895baf] <==
	I0731 17:10:29.067316       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:10:39.076203       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:10:39.076233       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:10:39.076359       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:10:39.076416       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:10:39.076488       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:10:39.076498       1 main.go:299] handling current node
	I0731 17:10:49.072088       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:10:49.072129       1 main.go:299] handling current node
	I0731 17:10:49.072141       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:10:49.072146       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:10:49.072341       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:10:49.072369       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:10:59.068725       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:10:59.069056       1 main.go:299] handling current node
	I0731 17:10:59.069318       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:10:59.069379       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:10:59.069623       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:10:59.069809       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:11:09.067717       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:11:09.067835       1 main.go:299] handling current node
	I0731 17:11:09.067855       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:11:09.067865       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:11:09.068308       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:11:09.068348       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:59:40.110698       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:59:50.118349       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:59:50.118427       1 main.go:299] handling current node
	I0731 16:59:50.118450       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:59:50.118464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:59:50.118651       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:59:50.118739       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.118883       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:00.118987       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:00.119126       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:00.119236       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.119356       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:00.119483       1 main.go:299] handling current node
	I0731 17:00:10.110002       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:10.111054       1 main.go:299] handling current node
	I0731 17:00:10.111286       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:10.111319       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:10.111445       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:10.111480       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:20.116250       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:20.116442       1 main.go:299] handling current node
	I0731 17:00:20.116458       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:20.116464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:20.116608       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:20.116672       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [7e0d32286913] <==
	I0731 17:05:50.070570       1 controller.go:80] Starting OpenAPI V3 AggregationController
	I0731 17:05:50.074783       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0731 17:05:50.074947       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:05:50.086677       1 crdregistration_controller.go:111] Starting crd-autoregister controller
	I0731 17:05:50.086708       1 shared_informer.go:313] Waiting for caches to sync for crd-autoregister
	I0731 17:05:50.117864       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0731 17:05:50.122120       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:05:50.122365       1 policy_source.go:224] refreshing policies
	I0731 17:05:50.132563       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0731 17:05:50.166384       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0731 17:05:50.168074       1 shared_informer.go:320] Caches are synced for configmaps
	I0731 17:05:50.168116       1 apf_controller.go:379] Running API Priority and Fairness config worker
	I0731 17:05:50.168122       1 apf_controller.go:382] Running API Priority and Fairness periodic rebalancing process
	I0731 17:05:50.170411       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0731 17:05:50.174248       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0731 17:05:50.178334       1 handler_discovery.go:447] Starting ResourceDiscoveryManager
	I0731 17:05:50.187980       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0731 17:05:50.188024       1 aggregator.go:165] initial CRD sync complete...
	I0731 17:05:50.188030       1 autoregister_controller.go:141] Starting autoregister controller
	I0731 17:05:50.188034       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0731 17:05:50.188038       1 cache.go:39] Caches are synced for autoregister controller
	E0731 17:05:50.205462       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I0731 17:05:51.075340       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0731 17:06:47.219071       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0731 17:07:08.422863       1 controller.go:615] quota admission added evaluator for: endpoints
	
	
	==> kube-apiserver [aec44315311a] <==
	I0731 17:03:27.253147       1 options.go:221] external host was not specified, using 192.169.0.5
	I0731 17:03:27.253888       1 server.go:148] Version: v1.30.3
	I0731 17:03:27.253988       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:03:27.786353       1 shared_informer.go:313] Waiting for caches to sync for node_authorizer
	I0731 17:03:27.788898       1 shared_informer.go:313] Waiting for caches to sync for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:03:27.790619       1 plugins.go:157] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0731 17:03:27.790629       1 plugins.go:160] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0731 17:03:27.790778       1 instance.go:299] Using reconciler: lease
	W0731 17:03:47.786207       1 logging.go:59] [core] [Channel #1 SubChannel #3] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0731 17:03:47.786314       1 logging.go:59] [core] [Channel #2 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	F0731 17:03:47.791937       1 instance.go:292] Error creating leases: error creating storage factory: context deadline exceeded
	
	
	==> kube-controller-manager [0a6a6d756b8d] <==
	I0731 17:05:30.561595       1 serving.go:380] Generated self-signed cert in-memory
	I0731 17:05:31.250391       1 controllermanager.go:189] "Starting" version="v1.30.3"
	I0731 17:05:31.250471       1 controllermanager.go:191] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:05:31.252077       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0731 17:05:31.252281       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0731 17:05:31.252444       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:05:31.254793       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	E0731 17:05:51.257636       1 controllermanager.go:234] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: an error on the server (\"[+]ping ok\\n[+]log ok\\n[+]etcd ok\\n[+]poststarthook/start-apiserver-admission-initializer ok\\n[+]poststarthook/generic-apiserver-start-informers ok\\n[+]poststarthook/priority-and-fairness-config-consumer ok\\n[+]poststarthook/priority-and-fairness-filter ok\\n[+]poststarthook/storage-object-count-tracker-hook ok\\n[+]poststarthook/start-apiextensions-informers ok\\n[+]poststarthook/start-apiextensions-controllers ok\\n[+]poststarthook/crd-informer-synced ok\\n[+]poststarthook/start-service-ip-repair-controllers ok\\n[-]poststarthook/rbac/bootstrap-roles failed: reason withheld\\n[+]poststarthook/scheduling/bootstrap-system-priority-classes ok\\n[+]poststarthook/priority-and-fairness-config-producer ok\\n[+]poststarthook/start-system-namespaces-controller
ok\\n[+]poststarthook/bootstrap-controller ok\\n[+]poststarthook/start-cluster-authentication-info-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok\\n[+]poststarthook/start-legacy-token-tracking-controller ok\\n[+]poststarthook/aggregator-reload-proxy-client-cert ok\\n[+]poststarthook/start-kube-aggregator-informers ok\\n[+]poststarthook/apiservice-registration-controller ok\\n[+]poststarthook/apiservice-status-available-controller ok\\n[+]poststarthook/apiservice-discovery-controller ok\\n[+]poststarthook/kube-apiserver-autoregistration ok\\n[+]autoregister-completion ok\\n[+]poststarthook/apiservice-openapi-controller ok\\n[+]poststarthook/apiservice-openapiv3-controller ok\\nhealthz check failed\") has prevented the request from succeeding"
	
	
	==> kube-controller-manager [42b34888f43b] <==
	I0731 17:06:12.895845       1 shared_informer.go:320] Caches are synced for ReplicationController
	I0731 17:06:12.903208       1 shared_informer.go:320] Caches are synced for GC
	I0731 17:06:12.903274       1 shared_informer.go:320] Caches are synced for taint-eviction-controller
	I0731 17:06:12.920443       1 shared_informer.go:320] Caches are synced for endpoint_slice
	I0731 17:06:12.952902       1 shared_informer.go:320] Caches are synced for endpoint_slice_mirroring
	I0731 17:06:12.964558       1 shared_informer.go:320] Caches are synced for resource quota
	I0731 17:06:13.012295       1 shared_informer.go:320] Caches are synced for resource quota
	I0731 17:06:13.022225       1 shared_informer.go:320] Caches are synced for ClusterRoleAggregator
	I0731 17:06:13.501091       1 shared_informer.go:320] Caches are synced for garbage collector
	I0731 17:06:13.558892       1 shared_informer.go:320] Caches are synced for garbage collector
	I0731 17:06:13.559095       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0731 17:06:26.973668       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="55.100255ms"
	I0731 17:06:26.975840       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="54.971µs"
	I0731 17:06:29.221856       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="14.898144ms"
	I0731 17:06:29.222046       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="66.05µs"
	I0731 17:06:47.214265       1 endpointslice_controller.go:311] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mvtct\": the object has been modified; please apply your changes to the latest version and try again"
	I0731 17:06:47.214807       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"a0a921f4-5219-42ca-94c6-a4038d9ff710", APIVersion:"v1", ResourceVersion:"259", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mvtct": the object has been modified; please apply your changes to the latest version and try again
	I0731 17:06:47.241205       1 endpointslice_controller.go:311] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mvtct\": the object has been modified; please apply your changes to the latest version and try again"
	I0731 17:06:47.241526       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="78.18352ms"
	I0731 17:06:47.241539       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"a0a921f4-5219-42ca-94c6-a4038d9ff710", APIVersion:"v1", ResourceVersion:"259", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mvtct": the object has been modified; please apply your changes to the latest version and try again
	E0731 17:06:47.241671       1 replica_set.go:557] sync "kube-system/coredns-7db6d8ff4d" failed with Operation cannot be fulfilled on replicasets.apps "coredns-7db6d8ff4d": the object has been modified; please apply your changes to the latest version and try again
	I0731 17:06:47.242012       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="316.596µs"
	I0731 17:06:47.246958       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="100.8µs"
	I0731 17:06:47.288893       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="32.237881ms"
	I0731 17:06:47.289070       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.102µs"
	
	
	==> kube-proxy [4f56054bbee1] <==
	I0731 17:06:08.426782       1 server_linux.go:69] "Using iptables proxy"
	I0731 17:06:08.446564       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 17:06:08.497695       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 17:06:08.497829       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 17:06:08.497985       1 server_linux.go:165] "Using iptables Proxier"
	I0731 17:06:08.502095       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 17:06:08.503040       1 server.go:872] "Version info" version="v1.30.3"
	I0731 17:06:08.503116       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:06:08.506909       1 config.go:192] "Starting service config controller"
	I0731 17:06:08.507443       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 17:06:08.507578       1 config.go:319] "Starting node config controller"
	I0731 17:06:08.507600       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 17:06:08.509126       1 config.go:101] "Starting endpoint slice config controller"
	I0731 17:06:08.509154       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 17:06:08.607797       1 shared_informer.go:320] Caches are synced for node config
	I0731 17:06:08.607880       1 shared_informer.go:320] Caches are synced for service config
	I0731 17:06:08.610417       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [a34d35a3b612] <==
	I0731 17:05:30.706492       1 serving.go:380] Generated self-signed cert in-memory
	W0731 17:05:41.405023       1 authentication.go:368] Error looking up in-cluster authentication configuration: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication": net/http: TLS handshake timeout
	W0731 17:05:41.405046       1 authentication.go:369] Continuing without authentication configuration. This may treat all requests as anonymous.
	W0731 17:05:41.405050       1 authentication.go:370] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I0731 17:05:50.110697       1 server.go:154] "Starting Kubernetes Scheduler" version="v1.30.3"
	I0731 17:05:50.110745       1 server.go:156] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:05:50.118585       1 secure_serving.go:213] Serving securely on 127.0.0.1:10259
	I0731 17:05:50.120054       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0731 17:05:50.120091       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0731 17:05:50.120106       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0731 17:05:50.221789       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kube-scheduler [d088fefe5f8e] <==
	E0731 17:04:26.658553       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:28.887716       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:28.887806       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:32.427417       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:32.427586       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:36.436787       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:36.436870       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:40.022061       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://192.169.0.5:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:40.022227       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://192.169.0.5:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:43.471012       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:43.471291       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:43.930296       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:43.930321       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:44.041999       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: Get "https://192.169.0.5:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:44.042358       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.169.0.5:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:48.230649       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:48.230983       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:58.373439       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:58.373554       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:05:00.249019       1 server.go:214] "waiting for handlers to sync" err="context canceled"
	I0731 17:05:00.249450       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	I0731 17:05:00.249577       1 secure_serving.go:258] Stopped listening on 127.0.0.1:10259
	E0731 17:05:00.249641       1 shared_informer.go:316] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0731 17:05:00.249670       1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	E0731 17:05:00.249984       1 run.go:74] "command failed" err="finished without leader elect"
	
	
	==> kubelet <==
	Jul 31 17:06:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:06:38 ha-393000 kubelet[1592]: I0731 17:06:38.567240    1592 scope.go:117] "RemoveContainer" containerID="6d966e37d361871f946979a92770e4f4459ed0d5ff621124310f7ec91474bd95"
	Jul 31 17:06:38 ha-393000 kubelet[1592]: I0731 17:06:38.567467    1592 scope.go:117] "RemoveContainer" containerID="c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381"
	Jul 31 17:06:38 ha-393000 kubelet[1592]: E0731 17:06:38.567576    1592 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(a59b97ca-f030-4c73-b4db-00b444d39095)\"" pod="kube-system/storage-provisioner" podUID="a59b97ca-f030-4c73-b4db-00b444d39095"
	Jul 31 17:06:50 ha-393000 kubelet[1592]: I0731 17:06:50.883078    1592 scope.go:117] "RemoveContainer" containerID="c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381"
	Jul 31 17:07:22 ha-393000 kubelet[1592]: E0731 17:07:22.903115    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:07:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:07:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:07:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:07:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:08:22 ha-393000 kubelet[1592]: E0731 17:08:22.903462    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:08:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:09:22 ha-393000 kubelet[1592]: E0731 17:09:22.903125    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:09:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:10:22 ha-393000 kubelet[1592]: E0731 17:10:22.903858    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:10:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:261: (dbg) Run:  kubectl --context ha-393000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:285: <<< TestMultiControlPlane/serial/RestartCluster FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/RestartCluster (375.27s)
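The kubelet and scheduler entries above all use the klog prefix format (the report's own "Last Start" section describes it as `[IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg`). When triaging dumps this size, it can help to parse that prefix programmatically. A minimal sketch; the regex below is an assumption derived from the lines in this report, not the canonical klog grammar:

```python
import re

# Parse klog-style prefixes such as:
#   E0731 17:05:00.249641       1 shared_informer.go:316] unable to sync caches ...
# Severity is one of I/W/E/F.
KLOG = re.compile(
    r'^(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+'
    r'(?P<tid>\d+) (?P<file>[\w./-]+):(?P<line>\d+)\] (?P<msg>.*)$'
)

def parse(line: str):
    """Return a dict of klog prefix fields, or None if the line doesn't match."""
    m = KLOG.match(line.strip())
    return m.groupdict() if m else None

rec = parse('E0731 17:05:00.249984       1 run.go:74] "command failed" '
            'err="finished without leader elect"')
print(rec["sev"], rec["file"], rec["line"])  # E run.go 74
```

Filtering a dump for `sev == "E"` lines quickly surfaces the leader-election and connection-refused errors shown above.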

TestMultiControlPlane/serial/DegradedAfterClusterRestart (4.93s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterClusterRestart
ha_test.go:390: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
ha_test.go:413: expected profile "ha-393000" in json of 'profile list' to have "Degraded" status but have "HAppy" status. got *"{\"invalid\":[],\"valid\":[{\"Name\":\"ha-393000\",\"Status\":\"HAppy\",\"Config\":{\"Name\":\"ha-393000\",\"KeepContext\":false,\"EmbedCerts\":false,\"MinikubeISO\":\"https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso\",\"KicBaseImage\":\"gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7\",\"Memory\":2200,\"CPUs\":2,\"DiskSize\":20000,\"Driver\":\"hyperkit\",\"HyperkitVpnKitSock\":\"\",\"HyperkitVSockPorts\":[],\"DockerEnv\":null,\"ContainerVolumeMounts\":null,\"InsecureRegistry\":null,\"RegistryMirror\":[],\"HostOnlyCIDR\":\"192.168.59.1/24\",\"HypervVirtualSwitch\":\"\",\"HypervUseExternalSwitch\":false,\"HypervExternalAdapter\":\"\",\"KVMNetwork\":\"default\",\"KVMQemuURI\":\"qemu:///system\",\"KVMGPU\":false,\"KVMHidden\":false,\"KVMNUMACount\":
1,\"APIServerPort\":8443,\"DockerOpt\":null,\"DisableDriverMounts\":false,\"NFSShare\":[],\"NFSSharesRoot\":\"/nfsshares\",\"UUID\":\"\",\"NoVTXCheck\":false,\"DNSProxy\":false,\"HostDNSResolver\":true,\"HostOnlyNicType\":\"virtio\",\"NatNicType\":\"virtio\",\"SSHIPAddress\":\"\",\"SSHUser\":\"root\",\"SSHKey\":\"\",\"SSHPort\":22,\"KubernetesConfig\":{\"KubernetesVersion\":\"v1.30.3\",\"ClusterName\":\"ha-393000\",\"Namespace\":\"default\",\"APIServerHAVIP\":\"192.169.0.254\",\"APIServerName\":\"minikubeCA\",\"APIServerNames\":null,\"APIServerIPs\":null,\"DNSDomain\":\"cluster.local\",\"ContainerRuntime\":\"docker\",\"CRISocket\":\"\",\"NetworkPlugin\":\"cni\",\"FeatureGates\":\"\",\"ServiceCIDR\":\"10.96.0.0/12\",\"ImageRepository\":\"\",\"LoadBalancerStartIP\":\"\",\"LoadBalancerEndIP\":\"\",\"CustomIngressCert\":\"\",\"RegistryAliases\":\"\",\"ExtraOptions\":null,\"ShouldLoadCachedImages\":true,\"EnableDefaultCNI\":false,\"CNI\":\"\"},\"Nodes\":[{\"Name\":\"\",\"IP\":\"192.169.0.5\",\"Port\":8443,\"Kubern
etesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m02\",\"IP\":\"192.169.0.6\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m03\",\"IP\":\"192.169.0.7\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m04\",\"IP\":\"192.169.0.8\",\"Port\":0,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":false,\"Worker\":true}],\"Addons\":{\"ambassador\":false,\"auto-pause\":false,\"cloud-spanner\":false,\"csi-hostpath-driver\":false,\"dashboard\":false,\"default-storageclass\":false,\"efk\":false,\"freshpod\":false,\"gcp-auth\":false,\"gvisor\":false,\"headlamp\":false,\"helm-tiller\":false,\"inaccel\":false,\"ingress\":false,\"ingress-dns\":false,\"inspektor-gadget\":false,\"istio\":false,\"istio-provisioner\":false,\"kong\":false,\"kubeflow\":false,\"kubevirt\":fal
se,\"logviewer\":false,\"metallb\":false,\"metrics-server\":false,\"nvidia-device-plugin\":false,\"nvidia-driver-installer\":false,\"nvidia-gpu-device-plugin\":false,\"olm\":false,\"pod-security-policy\":false,\"portainer\":false,\"registry\":false,\"registry-aliases\":false,\"registry-creds\":false,\"storage-provisioner\":false,\"storage-provisioner-gluster\":false,\"storage-provisioner-rancher\":false,\"volcano\":false,\"volumesnapshots\":false,\"yakd\":false},\"CustomAddonImages\":null,\"CustomAddonRegistries\":null,\"VerifyComponents\":{\"apiserver\":true,\"apps_running\":true,\"default_sa\":true,\"extra\":true,\"kubelet\":true,\"node_ready\":true,\"system_pods\":true},\"StartHostTimeout\":360000000000,\"ScheduledStop\":null,\"ExposedPorts\":[],\"ListenAddress\":\"\",\"Network\":\"\",\"Subnet\":\"\",\"MultiNodeRequested\":true,\"ExtraDisks\":0,\"CertExpiration\":94608000000000000,\"Mount\":false,\"MountString\":\"/Users:/minikube-host\",\"Mount9PVersion\":\"9p2000.L\",\"MountGID\":\"docker\",\"MountIP\":\
"\",\"MountMSize\":262144,\"MountOptions\":[],\"MountPort\":0,\"MountType\":\"9p\",\"MountUID\":\"docker\",\"BinaryMirror\":\"\",\"DisableOptimizations\":false,\"DisableMetrics\":false,\"CustomQemuFirmwarePath\":\"\",\"SocketVMnetClientPath\":\"\",\"SocketVMnetPath\":\"\",\"StaticIP\":\"\",\"SSHAuthSock\":\"\",\"SSHAgentPID\":0,\"GPUs\":\"\",\"AutoPauseInterval\":60000000000},\"Active\":false,\"ActiveKubeContext\":true}]}"*. args: "out/minikube-darwin-amd64 profile list --output json"
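The assertion at ha_test.go:413 compares the `Status` field of the matching profile in `out/minikube-darwin-amd64 profile list --output json` against the expected value. A minimal sketch of that check, using a trimmed-down sample of the JSON shape embedded in the failure message above ("HAppy" is the status minikube reports when every node of an HA cluster is healthy, which is why the expected "Degraded" check failed):

```python
import json

# Trimmed sample of the `profile list --output json` payload quoted in the
# failure message above; only the fields the check reads are kept.
sample = '{"invalid": [], "valid": [{"Name": "ha-393000", "Status": "HAppy"}]}'

profiles = json.loads(sample)["valid"]
status = next(p["Status"] for p in profiles if p["Name"] == "ha-393000")
print(status)  # HAppy -- the test expected "Degraded" here
```

In a live session the same payload would come from running the `profile list` command shown in the test output rather than from a hardcoded string.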
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:244: <<< TestMultiControlPlane/serial/DegradedAfterClusterRestart FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/DegradedAfterClusterRestart]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (3.604310066s)
helpers_test.go:252: TestMultiControlPlane/serial/DegradedAfterClusterRestart logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node stop m02 -v=7         | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:58 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node start m02 -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:59 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000 -v=7               | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-393000 -v=7                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT | 31 Jul 24 10:00 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	| node    | ha-393000 node delete m03 -v=7       | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | ha-393000 stop -v=7                  | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT | 31 Jul 24 10:05 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true             | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:05 PDT |                     |
	|         | -v=7 --alsologtostderr               |           |         |         |                     |                     |
	|         | --driver=hyperkit                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 10:05:02
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 10:05:02.368405    3827 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:05:02.368654    3827 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.368660    3827 out.go:304] Setting ErrFile to fd 2...
	I0731 10:05:02.368664    3827 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.368853    3827 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:05:02.370244    3827 out.go:298] Setting JSON to false
	I0731 10:05:02.392379    3827 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2072,"bootTime":1722443430,"procs":453,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:05:02.392490    3827 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:05:02.414739    3827 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 10:05:02.457388    3827 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:05:02.457417    3827 notify.go:220] Checking for updates...
	I0731 10:05:02.499271    3827 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:02.520330    3827 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:05:02.541352    3827 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:05:02.562183    3827 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:05:02.583467    3827 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:05:02.605150    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:02.605829    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.605892    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.615374    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51985
	I0731 10:05:02.615746    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.616162    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.616171    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.616434    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.616563    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.616815    3827 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:05:02.617053    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.617075    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.625506    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51987
	I0731 10:05:02.625873    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.626205    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.626218    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.626409    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.626526    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.655330    3827 out.go:177] * Using the hyperkit driver based on existing profile
	I0731 10:05:02.697472    3827 start.go:297] selected driver: hyperkit
	I0731 10:05:02.697517    3827 start.go:901] validating driver "hyperkit" against &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclas
s:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersio
n:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:02.697705    3827 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:05:02.697830    3827 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:05:02.698011    3827 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:05:02.707355    3827 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:05:02.711327    3827 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.711347    3827 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:05:02.714056    3827 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:05:02.714115    3827 cni.go:84] Creating CNI manager for ""
	I0731 10:05:02.714124    3827 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:05:02.714208    3827 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.16
9.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-t
iller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0
MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:02.714310    3827 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:05:02.756588    3827 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 10:05:02.778505    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:02.778576    3827 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:05:02.778606    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:05:02.778797    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:05:02.778816    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:05:02.779007    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:02.779936    3827 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:05:02.780056    3827 start.go:364] duration metric: took 96.562µs to acquireMachinesLock for "ha-393000"
	I0731 10:05:02.780090    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:05:02.780107    3827 fix.go:54] fixHost starting: 
	I0731 10:05:02.780518    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.780547    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.789537    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51989
	I0731 10:05:02.789941    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.790346    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.790360    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.790582    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.790683    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.790784    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:02.790882    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.790960    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:05:02.791917    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 3685 missing from process table
	I0731 10:05:02.791950    3827 fix.go:112] recreateIfNeeded on ha-393000: state=Stopped err=<nil>
	I0731 10:05:02.791969    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	W0731 10:05:02.792054    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:05:02.834448    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000" ...
	I0731 10:05:02.857592    3827 main.go:141] libmachine: (ha-393000) Calling .Start
	I0731 10:05:02.857865    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.857903    3827 main.go:141] libmachine: (ha-393000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 10:05:02.857999    3827 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 10:05:02.972788    3827 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 10:05:02.972822    3827 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:05:02.973002    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002e0840)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:02.973031    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002e0840)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:02.973095    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:05:02.973143    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:05:02.973162    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:05:02.974700    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Pid is 3840
	I0731 10:05:02.975089    3827 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 10:05:02.975104    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.975174    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:05:02.977183    3827 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 10:05:02.977235    3827 main.go:141] libmachine: (ha-393000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:05:02.977252    3827 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66aa6ebd}
	I0731 10:05:02.977264    3827 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 10:05:02.977271    3827 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 10:05:02.977358    3827 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 10:05:02.978043    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:02.978221    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:02.978639    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:05:02.978649    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.978783    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:02.978867    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:02.978959    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:02.979081    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:02.979169    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:02.979279    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:02.979484    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:02.979495    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:05:02.982358    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:05:03.035630    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:05:03.036351    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:03.036364    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:03.036371    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:03.036377    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:03.417037    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:05:03.417051    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:05:03.531673    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:03.531715    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:03.531732    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:03.531747    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:03.532606    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:05:03.532629    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:05:09.110387    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:05:09.110442    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:05:09.110451    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:05:09.135557    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:05:12.964386    3827 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.5:22: connect: connection refused
	I0731 10:05:16.034604    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:05:16.034620    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.034750    3827 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 10:05:16.034759    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.034882    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.034984    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.035084    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.035183    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.035281    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.035421    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.035570    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.035579    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 10:05:16.113215    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 10:05:16.113236    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.113381    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.113518    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.113636    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.113755    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.113885    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.114075    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.114086    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:05:16.184090    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:05:16.184121    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:05:16.184150    3827 buildroot.go:174] setting up certificates
	I0731 10:05:16.184163    3827 provision.go:84] configureAuth start
	I0731 10:05:16.184170    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.184309    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:16.184430    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.184520    3827 provision.go:143] copyHostCerts
	I0731 10:05:16.184558    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:16.184631    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:05:16.184638    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:16.184770    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:05:16.184969    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:16.185016    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:05:16.185020    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:16.185099    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:05:16.185248    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:16.185290    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:05:16.185295    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:16.185376    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:05:16.185533    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 10:05:16.315363    3827 provision.go:177] copyRemoteCerts
	I0731 10:05:16.315421    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:05:16.315435    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.315558    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.315655    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.315746    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.315837    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:16.355172    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:05:16.355248    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:05:16.374013    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:05:16.374082    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0731 10:05:16.392556    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:05:16.392614    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:05:16.411702    3827 provision.go:87] duration metric: took 227.524882ms to configureAuth
	I0731 10:05:16.411715    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:05:16.411879    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:16.411893    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:16.412059    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.412155    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.412231    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.412316    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.412388    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.412496    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.412621    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.412628    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:05:16.477022    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:05:16.477033    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:05:16.477102    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:05:16.477118    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.477251    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.477356    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.477432    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.477517    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.477641    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.477778    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.477823    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:05:16.554633    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:05:16.554652    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.554788    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.554883    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.554976    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.555060    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.555183    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.555333    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.555346    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:05:18.220571    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:05:18.220585    3827 machine.go:97] duration metric: took 15.241941013s to provisionDockerMachine
	I0731 10:05:18.220598    3827 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 10:05:18.220606    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:05:18.220616    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.220842    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:05:18.220863    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.220962    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.221049    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.221130    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.221229    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.266644    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:05:18.270380    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:05:18.270395    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:05:18.270494    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:05:18.270687    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:05:18.270693    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:05:18.270912    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:05:18.279363    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:18.313374    3827 start.go:296] duration metric: took 92.765768ms for postStartSetup
	I0731 10:05:18.313403    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.313592    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:05:18.313611    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.313704    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.313791    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.313881    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.313968    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.352727    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:05:18.352783    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:05:18.406781    3827 fix.go:56] duration metric: took 15.626681307s for fixHost
	I0731 10:05:18.406809    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.406951    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.407051    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.407152    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.407242    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.407364    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:18.407503    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:18.407510    3827 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 10:05:18.475125    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445518.591979627
	
	I0731 10:05:18.475138    3827 fix.go:216] guest clock: 1722445518.591979627
	I0731 10:05:18.475144    3827 fix.go:229] Guest: 2024-07-31 10:05:18.591979627 -0700 PDT Remote: 2024-07-31 10:05:18.406799 -0700 PDT m=+16.073052664 (delta=185.180627ms)
	I0731 10:05:18.475163    3827 fix.go:200] guest clock delta is within tolerance: 185.180627ms
	I0731 10:05:18.475167    3827 start.go:83] releasing machines lock for "ha-393000", held for 15.69510158s
	I0731 10:05:18.475186    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.475358    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:18.475493    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.475894    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.476002    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.476070    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:05:18.476101    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.476134    3827 ssh_runner.go:195] Run: cat /version.json
	I0731 10:05:18.476146    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.476186    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.476210    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.476297    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.476335    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.476385    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.476425    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.476484    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.476507    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.560719    3827 ssh_runner.go:195] Run: systemctl --version
	I0731 10:05:18.565831    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 10:05:18.570081    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:05:18.570125    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:05:18.582480    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:05:18.582493    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:18.582597    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:18.598651    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:05:18.607729    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:05:18.616451    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:05:18.616493    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:05:18.625351    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:18.634238    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:05:18.643004    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:18.651930    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:05:18.660791    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:05:18.669545    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:05:18.678319    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:05:18.687162    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:05:18.695297    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:05:18.703279    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:18.796523    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:05:18.814363    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:18.814439    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:05:18.827366    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:18.839312    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:05:18.855005    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:18.866218    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:18.877621    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:05:18.902460    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:18.913828    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:18.928675    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:05:18.931574    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:05:18.939501    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:05:18.952896    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:05:19.047239    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:05:19.144409    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:05:19.144484    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:05:19.159518    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:19.256187    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:05:21.607075    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.350869373s)
	I0731 10:05:21.607140    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:05:21.618076    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:05:21.632059    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:21.642878    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:05:21.739846    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:05:21.840486    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:21.956403    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:05:21.971397    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:21.982152    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:22.074600    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:05:22.139737    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:05:22.139811    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:05:22.144307    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:05:22.144354    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:05:22.147388    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:05:22.177098    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:05:22.177167    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:22.195025    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:22.255648    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:05:22.255698    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:22.256066    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:05:22.260342    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:22.270020    3827 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 10:05:22.270145    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:22.270198    3827 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:05:22.283427    3827 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:05:22.283451    3827 docker.go:615] Images already preloaded, skipping extraction
	I0731 10:05:22.283523    3827 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:05:22.296364    3827 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:05:22.296384    3827 cache_images.go:84] Images are preloaded, skipping loading
	I0731 10:05:22.296395    3827 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 10:05:22.296485    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:05:22.296554    3827 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 10:05:22.333611    3827 cni.go:84] Creating CNI manager for ""
	I0731 10:05:22.333625    3827 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:05:22.333642    3827 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 10:05:22.333657    3827 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 10:05:22.333735    3827 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 10:05:22.333754    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:05:22.333805    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:05:22.346453    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:05:22.346520    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:05:22.346575    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:05:22.354547    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:05:22.354585    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 10:05:22.361938    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 10:05:22.375252    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:05:22.388755    3827 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 10:05:22.402335    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:05:22.415747    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:05:22.418701    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:22.428772    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:22.517473    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:22.532209    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 10:05:22.532222    3827 certs.go:194] generating shared ca certs ...
	I0731 10:05:22.532233    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:22.532416    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:05:22.532495    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:05:22.532505    3827 certs.go:256] generating profile certs ...
	I0731 10:05:22.532617    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:05:22.532703    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e
	I0731 10:05:22.532784    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:05:22.532791    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:05:22.532813    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:05:22.532832    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:05:22.532850    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:05:22.532866    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:05:22.532896    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:05:22.532925    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:05:22.532949    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:05:22.533054    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:05:22.533101    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:05:22.533110    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:05:22.533142    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:05:22.533177    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:05:22.533206    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:05:22.533274    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:22.533306    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.533327    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.533344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.533765    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:05:22.562933    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:05:22.585645    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:05:22.608214    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:05:22.634417    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:05:22.664309    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:05:22.693214    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:05:22.749172    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:05:22.798119    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:05:22.837848    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:05:22.862351    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:05:22.887141    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 10:05:22.900789    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:05:22.904988    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:05:22.914154    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.917542    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.917577    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.921712    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:05:22.930986    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:05:22.940208    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.943536    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.943573    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.947845    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:05:22.957024    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:05:22.965988    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.969319    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.969351    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.973794    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:05:22.982944    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:05:22.986290    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:05:22.990544    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:05:22.994707    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:05:22.999035    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:05:23.003364    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:05:23.007486    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:05:23.011657    3827 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:23.011769    3827 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 10:05:23.024287    3827 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 10:05:23.032627    3827 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0731 10:05:23.032639    3827 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0731 10:05:23.032681    3827 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0731 10:05:23.040731    3827 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:05:23.041056    3827 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-393000" does not appear in /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.041141    3827 kubeconfig.go:62] /Users/jenkins/minikube-integration/19349-1046/kubeconfig needs updating (will repair): [kubeconfig missing "ha-393000" cluster setting kubeconfig missing "ha-393000" context setting]
	I0731 10:05:23.041332    3827 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.041968    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.042168    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 10:05:23.042482    3827 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 10:05:23.042638    3827 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0731 10:05:23.050561    3827 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0731 10:05:23.050575    3827 kubeadm.go:597] duration metric: took 17.931942ms to restartPrimaryControlPlane
	I0731 10:05:23.050580    3827 kubeadm.go:394] duration metric: took 38.928464ms to StartCluster
	I0731 10:05:23.050588    3827 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.050661    3827 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.051035    3827 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.051268    3827 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:05:23.051280    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:05:23.051290    3827 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 10:05:23.051393    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:23.095938    3827 out.go:177] * Enabled addons: 
	I0731 10:05:23.116914    3827 addons.go:510] duration metric: took 65.60253ms for enable addons: enabled=[]
	I0731 10:05:23.116954    3827 start.go:246] waiting for cluster config update ...
	I0731 10:05:23.116965    3827 start.go:255] writing updated cluster config ...
	I0731 10:05:23.138605    3827 out.go:177] 
	I0731 10:05:23.160466    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:23.160597    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.182983    3827 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 10:05:23.224869    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:23.224904    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:05:23.225104    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:05:23.225125    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:05:23.225250    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.226256    3827 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:05:23.226360    3827 start.go:364] duration metric: took 80.549µs to acquireMachinesLock for "ha-393000-m02"
	I0731 10:05:23.226385    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:05:23.226394    3827 fix.go:54] fixHost starting: m02
	I0731 10:05:23.226804    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:23.226838    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:23.236394    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52012
	I0731 10:05:23.236756    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:23.237106    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:23.237125    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:23.237342    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:23.237473    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:23.237574    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:05:23.237669    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.237738    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:05:23.238671    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:23.238732    3827 fix.go:112] recreateIfNeeded on ha-393000-m02: state=Stopped err=<nil>
	I0731 10:05:23.238750    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	W0731 10:05:23.238834    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:05:23.260015    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m02" ...
	I0731 10:05:23.302032    3827 main.go:141] libmachine: (ha-393000-m02) Calling .Start
	I0731 10:05:23.302368    3827 main.go:141] libmachine: (ha-393000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 10:05:23.302393    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.304220    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:23.304235    3827 main.go:141] libmachine: (ha-393000-m02) DBG | pid 3703 is in state "Stopped"
	I0731 10:05:23.304257    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid...
	I0731 10:05:23.304590    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 10:05:23.331752    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 10:05:23.331774    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:05:23.331901    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c2fc0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:23.331928    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c2fc0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:23.331992    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:05:23.332030    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:05:23.332051    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:05:23.333566    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Pid is 3849
	I0731 10:05:23.333951    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 10:05:23.333966    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.334032    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3849
	I0731 10:05:23.335680    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 10:05:23.335745    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:05:23.335779    3827 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:05:23.335790    3827 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbf52}
	I0731 10:05:23.335796    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 10:05:23.335803    3827 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 10:05:23.335842    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 10:05:23.336526    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:23.336703    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.337199    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:05:23.337210    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:23.337324    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:23.337431    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:23.337536    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:23.337643    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:23.337761    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:23.337898    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:23.338051    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:23.338058    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:05:23.341501    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:05:23.350236    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:05:23.351301    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:23.351321    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:23.351333    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:23.351364    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:23.736116    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:05:23.736132    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:05:23.851173    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:23.851191    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:23.851204    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:23.851217    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:23.852083    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:05:23.852399    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:05:29.408102    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:05:29.408171    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:05:29.408180    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:05:29.431671    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:05:34.400446    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:05:34.400461    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.400584    3827 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 10:05:34.400595    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.400705    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.400796    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.400890    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.400963    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.401039    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.401181    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.401327    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.401336    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 10:05:34.470038    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 10:05:34.470053    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.470199    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.470327    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.470407    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.470489    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.470615    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.470762    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.470773    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:05:34.535872    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:05:34.535890    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:05:34.535899    3827 buildroot.go:174] setting up certificates
	I0731 10:05:34.535905    3827 provision.go:84] configureAuth start
	I0731 10:05:34.535911    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.536042    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:34.536141    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.536239    3827 provision.go:143] copyHostCerts
	I0731 10:05:34.536274    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:34.536323    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:05:34.536328    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:34.536441    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:05:34.536669    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:34.536701    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:05:34.536706    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:34.536812    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:05:34.536958    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:34.536987    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:05:34.536992    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:34.537061    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:05:34.537222    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 10:05:34.648982    3827 provision.go:177] copyRemoteCerts
	I0731 10:05:34.649040    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:05:34.649057    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.649198    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.649295    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.649402    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.649489    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:34.683701    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:05:34.683772    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:05:34.703525    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:05:34.703596    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:05:34.722548    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:05:34.722624    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:05:34.742309    3827 provision.go:87] duration metric: took 206.391288ms to configureAuth
	I0731 10:05:34.742322    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:05:34.742483    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:34.742496    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:34.742630    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.742723    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.742814    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.742903    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.742982    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.743099    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.743260    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.743269    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:05:34.800092    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:05:34.800106    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:05:34.800191    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:05:34.800203    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.800330    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.800415    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.800506    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.800591    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.800702    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.800838    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.800885    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:05:34.869190    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:05:34.869210    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.869342    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.869439    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.869544    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.869626    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.869780    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.869920    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.869935    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:05:36.520454    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
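The command above shows minikube's unit-update idiom: write the candidate config to `*.new`, `diff` it against the live file, and only swap it in (and restart the service) when they differ or the old file is missing. A minimal local sketch of that pattern, using stand-in paths rather than the real `/lib/systemd/system` locations and omitting the `systemctl` calls:

```shell
# Write-new / diff / swap update pattern (illustrative stand-in paths).
set -eu
unit=./docker.service            # stand-in for /lib/systemd/system/docker.service
printf '%s\n' '[Unit]' 'Description=demo unit' > "${unit}.new"
# diff exits non-zero when the files differ OR the old file does not exist,
# so a first-time install (the case in the log) also takes the swap branch.
if ! diff -u "$unit" "${unit}.new" 2>/dev/null; then
  mv "${unit}.new" "$unit"       # the real flow also runs daemon-reload + restart here
  echo "unit updated"
fi
cat "$unit"
```

Because the swap is skipped when the content is unchanged, reruns of the provisioner avoid needless docker restarts.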
	
	I0731 10:05:36.520469    3827 machine.go:97] duration metric: took 13.183263325s to provisionDockerMachine
	I0731 10:05:36.520479    3827 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 10:05:36.520499    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:05:36.520508    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.520691    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:05:36.520702    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.520789    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.520884    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.520979    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.521066    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.561300    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:05:36.564926    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:05:36.564938    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:05:36.565027    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:05:36.565170    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:05:36.565176    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:05:36.565342    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:05:36.574123    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:36.603284    3827 start.go:296] duration metric: took 82.788869ms for postStartSetup
	I0731 10:05:36.603307    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.603494    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:05:36.603509    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.603613    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.603706    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.603803    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.603903    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.639240    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:05:36.639297    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:05:36.692559    3827 fix.go:56] duration metric: took 13.466165097s for fixHost
	I0731 10:05:36.692585    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.692728    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.692817    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.692901    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.692991    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.693111    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:36.693255    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:36.693263    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:05:36.752606    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445536.868457526
	
	I0731 10:05:36.752619    3827 fix.go:216] guest clock: 1722445536.868457526
	I0731 10:05:36.752626    3827 fix.go:229] Guest: 2024-07-31 10:05:36.868457526 -0700 PDT Remote: 2024-07-31 10:05:36.692574 -0700 PDT m=+34.358830009 (delta=175.883526ms)
	I0731 10:05:36.752636    3827 fix.go:200] guest clock delta is within tolerance: 175.883526ms
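The tolerance check above compares the guest's `date +%s.%N` against the host clock and accepts the skew if it is small. A sketch of that comparison using the exact timestamps logged (the 1-second tolerance is an assumption for illustration, not minikube's configured value):

```shell
# Guest-vs-host clock drift check, with the two epochs from the log above.
guest=1722445536.868457526
remote=1722445536.692574
# awk handles the sub-second float arithmetic portably.
delta=$(awk -v g="$guest" -v r="$remote" 'BEGIN { d = g - r; if (d < 0) d = -d; printf "%.6f", d }')
awk -v d="$delta" 'BEGIN { exit !(d < 1.0) }' && echo "within tolerance: ${delta}s"
```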
	I0731 10:05:36.752640    3827 start.go:83] releasing machines lock for "ha-393000-m02", held for 13.526270601s
	I0731 10:05:36.752657    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.752793    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:36.777379    3827 out.go:177] * Found network options:
	I0731 10:05:36.798039    3827 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 10:05:36.819503    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:05:36.819540    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820385    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820643    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820770    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:05:36.820818    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 10:05:36.820878    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:05:36.820996    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:05:36.821009    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.821024    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.821247    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.821250    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.821474    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.821525    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.821664    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.821739    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.821918    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 10:05:36.854335    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:05:36.854406    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:05:36.901302    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:05:36.901324    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:36.901422    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:36.917770    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:05:36.926621    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:05:36.935218    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:05:36.935259    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:05:36.943879    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:36.952873    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:05:36.961710    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:36.970281    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:05:36.979176    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:05:36.987922    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:05:36.996548    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
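The run of `sed` commands above rewrites `/etc/containerd/config.toml` in place to force the `cgroupfs` cgroup driver. The same `SystemdCgroup` rewrite can be reproduced against a sample file (GNU `sed -i -r` assumed, as on the Buildroot guest; BSD sed needs `-i ''`):

```shell
# Reproduce minikube's SystemdCgroup rewrite on a sample containerd config.
cat > config.toml <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF
# Capture-group \1 preserves the original indentation while flipping the value.
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' config.toml
grep SystemdCgroup config.toml
```

The capture group matters: TOML under `[plugins...]` tables is indented, and dropping the leading spaces would still parse but would no longer match the file's style.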
	I0731 10:05:37.005349    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:05:37.013281    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:05:37.020977    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:37.118458    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:05:37.137862    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:37.137937    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:05:37.153588    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:37.167668    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:05:37.181903    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:37.192106    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:37.202268    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:05:37.223314    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:37.233629    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:37.248658    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:05:37.251547    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:05:37.258758    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:05:37.272146    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:05:37.371218    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:05:37.472623    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:05:37.472648    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:05:37.486639    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:37.587113    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:05:39.947283    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.360151257s)
	I0731 10:05:39.947347    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:05:39.958391    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:05:39.972060    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:39.983040    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:05:40.085475    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:05:40.202062    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:40.302654    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:05:40.316209    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:40.326252    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:40.418074    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:05:40.482758    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:05:40.482836    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:05:40.487561    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:05:40.487613    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:05:40.491035    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:05:40.518347    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:05:40.518420    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:40.537051    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:40.576384    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:05:40.597853    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:05:40.618716    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:40.618993    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:05:40.622501    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
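The `/etc/hosts` command above uses a filter-and-append idiom: strip any stale `host.minikube.internal` line, append the fresh mapping, and copy the result back over the original. A self-contained sketch on a sample file (the stale `192.169.0.9` entry is invented to show the removal; the log's own rewrite targets the real `/etc/hosts`):

```shell
# Filter-and-append rewrite of a hosts file (stand-in for /etc/hosts).
hosts=./hosts.sample
printf '192.169.0.9\thost.minikube.internal\n127.0.0.1\tlocalhost\n' > "$hosts"
tab=$(printf '\t')
# Drop any line ending in "<tab>host.minikube.internal", then append the new one.
{ grep -v "${tab}host.minikube.internal\$" "$hosts"
  printf '192.169.0.1\thost.minikube.internal\n'; } > "$hosts.tmp"
mv "$hosts.tmp" "$hosts"
cat "$hosts"
```

Writing to a temp file first (as the logged command does with `/tmp/h.$$`) avoids truncating the hosts file while `grep` is still reading it.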
	I0731 10:05:40.631917    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:05:40.632085    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:40.632302    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:40.632324    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:40.640887    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52034
	I0731 10:05:40.641227    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:40.641546    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:40.641557    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:40.641784    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:40.641900    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:40.641993    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:40.642069    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:05:40.643035    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:05:40.643318    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:40.643340    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:40.651868    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52036
	I0731 10:05:40.652209    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:40.652562    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:40.652581    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:40.652781    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:40.652890    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:40.652982    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.6
	I0731 10:05:40.652988    3827 certs.go:194] generating shared ca certs ...
	I0731 10:05:40.653003    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:40.653135    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:05:40.653190    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:05:40.653199    3827 certs.go:256] generating profile certs ...
	I0731 10:05:40.653301    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:05:40.653388    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.59c17652
	I0731 10:05:40.653436    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:05:40.653443    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:05:40.653468    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:05:40.653489    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:05:40.653510    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:05:40.653529    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:05:40.653548    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:05:40.653566    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:05:40.653584    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:05:40.653667    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:05:40.653713    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:05:40.653722    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:05:40.653755    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:05:40.653790    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:05:40.653819    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:05:40.653897    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:40.653931    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:05:40.653957    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:40.653976    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:05:40.654001    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:40.654103    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:40.654205    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:40.654295    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:40.654382    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:40.686134    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0731 10:05:40.689771    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 10:05:40.697866    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0731 10:05:40.700957    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 10:05:40.708798    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 10:05:40.711973    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 10:05:40.719794    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0731 10:05:40.722937    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 10:05:40.731558    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0731 10:05:40.734708    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 10:05:40.742535    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0731 10:05:40.745692    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
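Each `stat`/`scp` pair above probes for an existing cluster key on the remote host and only copies it down when the probe succeeds, so a joining control-plane node reuses the cluster's shared certs. A local stand-in for that check-then-copy step (plain `cp` instead of SSH; the `stat -c`/`-f` fallback covers both GNU and BSD stat):

```shell
# Probe-then-copy: fetch a key only if stat confirms it exists.
src=./sa.pub.sample              # stand-in for /var/lib/minikube/certs/sa.pub
printf 'demo-key\n' > "$src"
if stat -c %s "$src" >/dev/null 2>&1 || stat -f %z "$src" >/dev/null 2>&1; then
  cp "$src" ./sa.pub.copy        # the real flow scp's the bytes into memory
fi
cat ./sa.pub.copy
```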
	I0731 10:05:40.753969    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:05:40.774721    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:05:40.793621    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:05:40.813481    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:05:40.833191    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:05:40.853099    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:05:40.872942    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:05:40.892952    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:05:40.912690    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:05:40.932438    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:05:40.952459    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:05:40.971059    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 10:05:40.984708    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 10:05:40.998235    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 10:05:41.011745    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 10:05:41.025144    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 10:05:41.038794    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 10:05:41.052449    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 10:05:41.066415    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:05:41.070679    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:05:41.078894    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.082206    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.082237    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.086362    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:05:41.094634    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:05:41.103040    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.106511    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.106559    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.110939    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:05:41.119202    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:05:41.127421    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.130734    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.130783    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.134845    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:05:41.142958    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:05:41.146291    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:05:41.150662    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:05:41.154843    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:05:41.159061    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:05:41.163240    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:05:41.167541    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:05:41.171729    3827 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.30.3 docker true true} ...
	I0731 10:05:41.171784    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:05:41.171806    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:05:41.171838    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:05:41.184093    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:05:41.184125    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:05:41.184181    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:05:41.191780    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:05:41.191825    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 10:05:41.199155    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:05:41.212419    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:05:41.225964    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:05:41.239859    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:05:41.242661    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
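The `/etc/hosts` one-liner above is a strip-then-append idiom: remove any stale line for `control-plane.minikube.internal`, append the current VIP, write the result to a temp file, and copy it back in one step. The same idiom run against a temp file instead of the real `/etc/hosts` (no sudo needed; requires bash for the `$'\t'` quoting; the stale `192.168.99.99` entry is a made-up placeholder):

```shell
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.168.99.99\tcontrol-plane.minikube.internal\n' > "$hosts"
# Drop any existing mapping for the name, then append the fresh one.
{ grep -v $'\tcontrol-plane.minikube.internal$' "$hosts"
  printf '192.169.0.254\tcontrol-plane.minikube.internal\n'; } > "$hosts.$$"
mv "$hosts.$$" "$hosts"
cat "$hosts"
```

Writing to `$hosts.$$` first and then replacing the file is what keeps the update atomic from the point of view of concurrent readers.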
	I0731 10:05:41.251855    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:41.345266    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:41.360525    3827 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:05:41.360751    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:41.382214    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:05:41.402932    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:41.525126    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:41.539502    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:41.539699    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:05:41.539742    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:05:41.539934    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m02" to be "Ready" ...
	I0731 10:05:41.540009    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:41.540015    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:41.540022    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:41.540026    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.017427    3827 round_trippers.go:574] Response Status: 200 OK in 8477 milliseconds
	I0731 10:05:50.018648    3827 node_ready.go:49] node "ha-393000-m02" has status "Ready":"True"
	I0731 10:05:50.018662    3827 node_ready.go:38] duration metric: took 8.478709659s for node "ha-393000-m02" to be "Ready" ...
	I0731 10:05:50.018668    3827 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:05:50.018717    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:05:50.018723    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.018731    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.018737    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.028704    3827 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0731 10:05:50.043501    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.043562    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:05:50.043568    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.043574    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.043579    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.049258    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.050015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.050025    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.050031    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.050035    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.066794    3827 round_trippers.go:574] Response Status: 200 OK in 16 milliseconds
	I0731 10:05:50.067093    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.067103    3827 pod_ready.go:81] duration metric: took 23.584491ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.067110    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.067150    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 10:05:50.067155    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.067161    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.067170    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.072229    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.072653    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.072662    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.072674    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.072678    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.076158    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:50.076475    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.076487    3827 pod_ready.go:81] duration metric: took 9.372147ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.076494    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.076536    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 10:05:50.076541    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.076547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.076551    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.079467    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.079849    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.079858    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.079866    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.079871    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.086323    3827 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 10:05:50.086764    3827 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.086775    3827 pod_ready.go:81] duration metric: took 10.276448ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.086782    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.086839    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 10:05:50.086846    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.086852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.086861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.090747    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:50.091293    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:50.091301    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.091306    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.091310    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.093538    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.094155    3827 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.094165    3827 pod_ready.go:81] duration metric: took 7.376399ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.094171    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.094209    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 10:05:50.094214    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.094220    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.094223    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.096892    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.219826    3827 request.go:629] Waited for 122.388601ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:05:50.219867    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:05:50.219876    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.219882    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.219887    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.222303    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.222701    3827 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.222710    3827 pod_ready.go:81] duration metric: took 128.533092ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.222720    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.419341    3827 request.go:629] Waited for 196.517978ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:05:50.419372    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:05:50.419376    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.419382    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.419386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.424561    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.619242    3827 request.go:629] Waited for 194.143472ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.619333    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.619339    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.619346    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.619350    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.622245    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.622550    3827 pod_ready.go:97] node "ha-393000" hosting pod "kube-apiserver-ha-393000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-393000" has status "Ready":"False"
	I0731 10:05:50.622563    3827 pod_ready.go:81] duration metric: took 399.836525ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	E0731 10:05:50.622570    3827 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-393000" hosting pod "kube-apiserver-ha-393000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-393000" has status "Ready":"False"
	I0731 10:05:50.622575    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.819353    3827 request.go:629] Waited for 196.739442ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:50.819427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:50.819433    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.819438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.819447    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.822809    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:51.019387    3827 request.go:629] Waited for 196.0195ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.019427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.019480    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.019488    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.019494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.021643    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.220184    3827 request.go:629] Waited for 96.247837ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.220254    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.220260    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.220266    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.220271    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.222468    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.419702    3827 request.go:629] Waited for 196.732028ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.419735    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.419739    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.419746    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.419749    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.422018    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.622851    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.622865    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.622870    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.622873    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.625570    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.818923    3827 request.go:629] Waited for 192.647007ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.818965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.818971    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.818977    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.818981    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.821253    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.123108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:52.123124    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.123133    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.123137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.125336    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.220188    3827 request.go:629] Waited for 94.282602ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.220295    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.220306    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.220317    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.220325    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.223136    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.623123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:52.623202    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.623217    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.623227    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.626259    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:52.626893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.626903    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.626912    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.626916    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.628416    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:52.628799    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
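The alternating pod/node GETs above are a plain poll-until-Ready loop: `pod_ready` re-fetches the pod and its node roughly every 500ms until the `Ready` condition flips to `"True"` or the 6m budget runs out. A minimal Python sketch of that loop, with `get_status` standing in for the two API round-trips (the `fake_status` helper is hypothetical, for illustration only):

```python
import time

def wait_ready(get_status, timeout=360.0, interval=0.5):
    """Poll get_status() until it returns True or the timeout elapses,
    mirroring the pod_ready retry loop in the log above."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if get_status():
            return True
        time.sleep(interval)
    return False

# Simulated status check: reports Ready on the third poll.
polls = {"n": 0}
def fake_status():
    polls["n"] += 1
    return polls["n"] >= 3

print(wait_ready(fake_status, timeout=5.0, interval=0.01))  # → True
```

The client-side throttling waits interleaved in the log (`request.go:629`) come from client-go's rate limiter, not from this loop itself.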
	I0731 10:05:53.124413    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:53.124432    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.124441    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.124446    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.127045    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:53.127494    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:53.127501    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.127511    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.127514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.129223    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:53.623065    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:53.623121    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.623133    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.623142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.626047    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:53.626707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:53.626717    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.626725    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.626729    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.628447    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:54.123646    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:54.123761    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.123778    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.123788    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.127286    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:54.128015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:54.128025    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.128033    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.128038    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.130101    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:54.623229    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:54.623244    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.623253    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.623266    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.625325    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:54.625780    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:54.625788    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.625794    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.625798    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.627218    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:55.123298    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:55.123318    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.123329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.123334    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.126495    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:55.127199    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:55.127207    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.127213    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.127217    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.128585    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:55.128968    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:55.623994    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:55.624008    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.624016    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.624021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.626813    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:55.627329    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:55.627336    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.627342    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.627345    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.628805    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:56.123118    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:56.123195    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.123210    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.123231    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.126276    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:56.126864    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:56.126872    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.126877    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.126881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.128479    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:56.623814    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:56.623924    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.623942    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.623953    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.626841    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:56.627450    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:56.627457    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.627463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.627467    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.628844    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:57.124173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:57.124250    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.124262    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.124287    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.127734    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:57.128370    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:57.128377    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.128383    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.128386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.130108    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:57.130481    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:57.624004    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:57.624033    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.624093    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.624103    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.627095    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:57.628522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:57.628533    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.628541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.628547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.630446    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.123493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:58.123505    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.123512    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.123514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.125506    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.126108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:58.126116    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.126121    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.126124    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.127991    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.623114    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:58.623141    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.623216    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.623228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.626428    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:58.627173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:58.627181    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.627187    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.627191    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.628749    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.123212    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:59.123231    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.123243    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.123249    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.126584    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:59.127100    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:59.127110    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.127118    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.127123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.129080    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.624707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:59.624736    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.624808    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.624814    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.627710    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:59.628543    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:59.628550    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.628556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.628560    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.630077    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.630437    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:00.123863    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:00.123878    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.123885    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.123888    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.125761    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.126237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:00.126245    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.126251    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.126254    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.127937    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.623226    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:00.623240    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.623246    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.623249    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.625210    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.625691    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:00.625699    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.625704    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.625708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.627280    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:01.124705    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:01.124804    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.124820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.124830    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.127445    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:01.127933    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:01.127941    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.127947    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.127950    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.129462    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:01.623718    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:01.623731    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.623736    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.623739    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.625948    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:01.626336    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:01.626344    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.626349    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.626352    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.627901    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.124021    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:02.124081    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.124088    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.124092    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.125801    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.126187    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:02.126195    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.126200    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.126204    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.127656    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.127974    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:02.623206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:02.623222    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.623228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.623232    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.626774    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:02.627381    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:02.627389    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.627395    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.627400    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.630037    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:03.122889    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:03.122980    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.122991    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.122997    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.125539    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:03.125964    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:03.125972    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.125976    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.125991    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.129847    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:03.623340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:03.623368    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.623379    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.623386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.626892    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:03.627517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:03.627524    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.627530    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.627532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.629281    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.123967    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:04.124007    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.124016    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.124021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.126604    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.127104    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.127111    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.127116    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.127131    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.128806    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.129260    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.129268    3827 pod_ready.go:81] duration metric: took 13.506690115s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.129277    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.129312    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:04.129317    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.129323    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.129328    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.131506    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.131966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.131974    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.131980    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.131984    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.133464    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.133963    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.133974    3827 pod_ready.go:81] duration metric: took 4.690553ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.133981    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.134013    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:04.134018    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.134023    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.134028    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.136093    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.136498    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:04.136506    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.136512    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.136515    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.138480    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.138864    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.138874    3827 pod_ready.go:81] duration metric: took 4.887644ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.138882    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.138917    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:04.138922    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.138928    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.138932    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.140760    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.141121    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.141129    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.141134    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.141137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.143127    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.143455    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.143464    3827 pod_ready.go:81] duration metric: took 4.577275ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.143471    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.143508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:04.143513    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.143519    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.143523    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.145638    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.145987    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.145994    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.146000    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.146003    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.147718    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.148046    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.148055    3827 pod_ready.go:81] duration metric: took 4.578507ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.148061    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.325414    3827 request.go:629] Waited for 177.298505ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:04.325532    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:04.325544    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.325555    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.325563    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.328825    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:04.525753    3827 request.go:629] Waited for 196.338568ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.525806    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.525817    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.525828    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.525836    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.529114    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:04.529604    3827 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.529616    3827 pod_ready.go:81] duration metric: took 381.550005ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.529625    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.724886    3827 request.go:629] Waited for 195.165832ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:04.724925    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:04.724931    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.724937    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.724942    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.726934    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.924942    3827 request.go:629] Waited for 197.623557ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.924972    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.924977    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.924984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.924987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.927056    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.927556    3827 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.927566    3827 pod_ready.go:81] duration metric: took 397.934888ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.927572    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.124719    3827 request.go:629] Waited for 197.081968ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:05.124759    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:05.124767    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.124774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.124777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.126705    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:05.324036    3827 request.go:629] Waited for 196.854241ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.324127    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.324136    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.324144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.324151    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.326450    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:05.326831    3827 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:05.326840    3827 pod_ready.go:81] duration metric: took 399.263993ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.326854    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.525444    3827 request.go:629] Waited for 198.543186ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:05.525479    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:05.525484    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.525490    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.525494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.527459    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:05.724382    3827 request.go:629] Waited for 196.465154ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.724493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.724505    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.724516    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.724528    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.727650    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:05.728134    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:05.728147    3827 pod_ready.go:81] duration metric: took 401.285988ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.728155    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.925067    3827 request.go:629] Waited for 196.808438ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:05.925117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:05.925127    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.925137    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.925147    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.928198    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.125772    3827 request.go:629] Waited for 196.79397ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:06.125895    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:06.125907    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.125918    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.125924    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.129114    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.129535    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:06.129548    3827 pod_ready.go:81] duration metric: took 401.386083ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.129557    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.324601    3827 request.go:629] Waited for 194.995432ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:06.324707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:06.324718    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.324729    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.324736    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.327699    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:06.524056    3827 request.go:629] Waited for 195.918056ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:06.524164    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:06.524175    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.524186    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.524192    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.527800    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.528245    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:06.528255    3827 pod_ready.go:81] duration metric: took 398.692914ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.528262    3827 pod_ready.go:38] duration metric: took 16.509588377s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:06.528282    3827 api_server.go:52] waiting for apiserver process to appear ...
	I0731 10:06:06.528341    3827 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:06:06.541572    3827 api_server.go:72] duration metric: took 25.181024878s to wait for apiserver process to appear ...
	I0731 10:06:06.541584    3827 api_server.go:88] waiting for apiserver healthz status ...
	I0731 10:06:06.541605    3827 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 10:06:06.544968    3827 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 10:06:06.545011    3827 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 10:06:06.545016    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.545023    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.545027    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.545730    3827 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 10:06:06.545799    3827 api_server.go:141] control plane version: v1.30.3
	I0731 10:06:06.545808    3827 api_server.go:131] duration metric: took 4.219553ms to wait for apiserver health ...
	I0731 10:06:06.545813    3827 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 10:06:06.724899    3827 request.go:629] Waited for 179.053526ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:06.724936    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:06.724942    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.724948    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.724951    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.733411    3827 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0731 10:06:06.742910    3827 system_pods.go:59] 24 kube-system pods found
	I0731 10:06:06.742937    3827 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:06.742945    3827 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:06.742950    3827 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:06.742953    3827 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:06.742958    3827 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:06.742961    3827 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:06.742963    3827 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:06.742966    3827 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:06.742968    3827 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:06.742971    3827 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:06.742973    3827 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:06.742977    3827 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:06.742981    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:06.742984    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:06.742986    3827 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:06.742989    3827 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:06.742991    3827 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:06.742995    3827 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:06.742998    3827 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:06.743001    3827 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:06.743003    3827 system_pods.go:61] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Pending
	I0731 10:06:06.743006    3827 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:06.743010    3827 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:06.743012    3827 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:06.743017    3827 system_pods.go:74] duration metric: took 197.200154ms to wait for pod list to return data ...
	I0731 10:06:06.743023    3827 default_sa.go:34] waiting for default service account to be created ...
	I0731 10:06:06.925020    3827 request.go:629] Waited for 181.949734ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:06.925060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:06.925067    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.925076    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.925081    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.927535    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:06.927730    3827 default_sa.go:45] found service account: "default"
	I0731 10:06:06.927740    3827 default_sa.go:55] duration metric: took 184.712762ms for default service account to be created ...
	I0731 10:06:06.927745    3827 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 10:06:07.125051    3827 request.go:629] Waited for 197.272072ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:07.125082    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:07.125090    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:07.125100    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:07.125104    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:07.129975    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:07.134630    3827 system_pods.go:86] 24 kube-system pods found
	I0731 10:06:07.134648    3827 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:07.134654    3827 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:07.134659    3827 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:07.134663    3827 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:07.134666    3827 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:07.134671    3827 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running / Ready:ContainersNotReady (containers with unready status: [kindnet-cni]) / ContainersReady:ContainersNotReady (containers with unready status: [kindnet-cni])
	I0731 10:06:07.134675    3827 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:07.134679    3827 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:07.134683    3827 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:07.134705    3827 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:07.134712    3827 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:07.134718    3827 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0731 10:06:07.134723    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:07.134728    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:07.134731    3827 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:07.134735    3827 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:07.134739    3827 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running / Ready:ContainersNotReady (containers with unready status: [kube-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-proxy])
	I0731 10:06:07.134743    3827 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:07.134747    3827 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:07.134751    3827 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:07.134755    3827 system_pods.go:89] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:07.134764    3827 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:07.134768    3827 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:07.134772    3827 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0731 10:06:07.134781    3827 system_pods.go:126] duration metric: took 207.030567ms to wait for k8s-apps to be running ...
	I0731 10:06:07.134786    3827 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 10:06:07.134841    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:06:07.148198    3827 system_svc.go:56] duration metric: took 13.406485ms WaitForService to wait for kubelet
	I0731 10:06:07.148215    3827 kubeadm.go:582] duration metric: took 25.78766951s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:06:07.148230    3827 node_conditions.go:102] verifying NodePressure condition ...
	I0731 10:06:07.324197    3827 request.go:629] Waited for 175.905806ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:07.324227    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:07.324232    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:07.324238    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:07.324243    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:07.329946    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:07.330815    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330830    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330840    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330843    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330847    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330850    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330853    3827 node_conditions.go:105] duration metric: took 182.619551ms to run NodePressure ...
	I0731 10:06:07.330860    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:06:07.330878    3827 start.go:255] writing updated cluster config ...
	I0731 10:06:07.352309    3827 out.go:177] 
	I0731 10:06:07.373528    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:07.373631    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.433500    3827 out.go:177] * Starting "ha-393000-m03" control-plane node in "ha-393000" cluster
	I0731 10:06:07.475236    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:06:07.475262    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:06:07.475398    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:06:07.475412    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:06:07.475498    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.476024    3827 start.go:360] acquireMachinesLock for ha-393000-m03: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:06:07.476077    3827 start.go:364] duration metric: took 40.57µs to acquireMachinesLock for "ha-393000-m03"
	I0731 10:06:07.476090    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:06:07.476095    3827 fix.go:54] fixHost starting: m03
	I0731 10:06:07.476337    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:07.476357    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:07.485700    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52041
	I0731 10:06:07.486069    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:07.486427    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:07.486449    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:07.486677    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:07.486797    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:07.486888    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 10:06:07.486969    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.487057    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 10:06:07.488010    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:06:07.488031    3827 fix.go:112] recreateIfNeeded on ha-393000-m03: state=Stopped err=<nil>
	I0731 10:06:07.488039    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	W0731 10:06:07.488129    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:06:07.525270    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m03" ...
	I0731 10:06:07.583189    3827 main.go:141] libmachine: (ha-393000-m03) Calling .Start
	I0731 10:06:07.583357    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.583398    3827 main.go:141] libmachine: (ha-393000-m03) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid
	I0731 10:06:07.584444    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:06:07.584457    3827 main.go:141] libmachine: (ha-393000-m03) DBG | pid 2994 is in state "Stopped"
	I0731 10:06:07.584473    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid...
	I0731 10:06:07.584622    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Using UUID 451d6bef-97a7-42a6-8ccb-b8851dda0594
	I0731 10:06:07.614491    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Generated MAC 3e:56:a2:18:e2:4c
	I0731 10:06:07.614519    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:06:07.614662    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003b11a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:07.614709    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003b11a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:07.614792    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "451d6bef-97a7-42a6-8ccb-b8851dda0594", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:06:07.614841    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 451d6bef-97a7-42a6-8ccb-b8851dda0594 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:06:07.614865    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:06:07.616508    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Pid is 3858
	I0731 10:06:07.617000    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 0
	I0731 10:06:07.617017    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.617185    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 3858
	I0731 10:06:07.619558    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 10:06:07.619621    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:06:07.619647    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:06:07.619664    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:06:07.619685    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:06:07.619703    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 10:06:07.619712    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Found match: 3e:56:a2:18:e2:4c
	I0731 10:06:07.619727    3827 main.go:141] libmachine: (ha-393000-m03) DBG | IP: 192.169.0.7
	I0731 10:06:07.619755    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 10:06:07.620809    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:07.621055    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.621590    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:06:07.621602    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:07.621745    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:07.621861    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:07.621957    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:07.622061    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:07.622150    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:07.622290    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:07.622460    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:07.622469    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:06:07.625744    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:06:07.635188    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:06:07.636453    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:07.636476    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:07.636488    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:07.636503    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:08.026194    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:06:08.026210    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:06:08.141380    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:08.141403    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:08.141420    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:08.141430    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:08.142228    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:06:08.142237    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:06:13.717443    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:06:13.717596    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:06:13.717612    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:06:13.741129    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:06:18.682578    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:06:18.682599    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.682767    3827 buildroot.go:166] provisioning hostname "ha-393000-m03"
	I0731 10:06:18.682779    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.682866    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.682981    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.683070    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.683166    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.683267    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.683412    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.683571    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.683581    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m03 && echo "ha-393000-m03" | sudo tee /etc/hostname
	I0731 10:06:18.749045    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m03
	
	I0731 10:06:18.749064    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.749190    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.749278    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.749369    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.749454    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.749565    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.749706    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.749722    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:06:18.806865    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:06:18.806883    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:06:18.806892    3827 buildroot.go:174] setting up certificates
	I0731 10:06:18.806898    3827 provision.go:84] configureAuth start
	I0731 10:06:18.806904    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.807035    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:18.807129    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.807209    3827 provision.go:143] copyHostCerts
	I0731 10:06:18.807236    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:06:18.807287    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:06:18.807293    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:06:18.807440    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:06:18.807654    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:06:18.807687    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:06:18.807691    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:06:18.807798    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:06:18.807946    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:06:18.807978    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:06:18.807983    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:06:18.808051    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:06:18.808199    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m03 san=[127.0.0.1 192.169.0.7 ha-393000-m03 localhost minikube]
	I0731 10:06:18.849388    3827 provision.go:177] copyRemoteCerts
	I0731 10:06:18.849440    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:06:18.849454    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.849608    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.849706    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.849793    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.849878    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:18.882927    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:06:18.883001    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:06:18.902836    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:06:18.902904    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:06:18.922711    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:06:18.922778    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 10:06:18.943709    3827 provision.go:87] duration metric: took 136.803232ms to configureAuth
	I0731 10:06:18.943724    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:06:18.943896    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:18.943910    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:18.944075    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.944168    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.944245    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.944342    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.944422    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.944538    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.944665    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.944672    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:06:18.996744    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:06:18.996756    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:06:18.996829    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:06:18.996840    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.996972    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.997082    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.997171    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.997252    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.997394    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.997538    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.997587    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:06:19.061774    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:06:19.061792    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:19.061924    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:19.062001    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:19.062094    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:19.062183    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:19.062322    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:19.062475    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:19.062487    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:06:20.667693    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:06:20.667709    3827 machine.go:97] duration metric: took 13.046112735s to provisionDockerMachine
	I0731 10:06:20.667718    3827 start.go:293] postStartSetup for "ha-393000-m03" (driver="hyperkit")
	I0731 10:06:20.667725    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:06:20.667738    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.667939    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:06:20.667954    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.668063    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.668167    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.668260    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.668365    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:20.711043    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:06:20.714520    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:06:20.714533    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:06:20.714632    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:06:20.714782    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:06:20.714789    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:06:20.714971    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:06:20.725237    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:06:20.756197    3827 start.go:296] duration metric: took 88.463878ms for postStartSetup
	I0731 10:06:20.756221    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.756402    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:06:20.756417    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.756509    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.756594    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.756688    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.756757    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:20.788829    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:06:20.788889    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:06:20.841715    3827 fix.go:56] duration metric: took 13.365618842s for fixHost
	I0731 10:06:20.841743    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.841878    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.841982    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.842069    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.842155    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.842314    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:20.842486    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:20.842494    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:06:20.895743    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445580.896263750
	
	I0731 10:06:20.895763    3827 fix.go:216] guest clock: 1722445580.896263750
	I0731 10:06:20.895768    3827 fix.go:229] Guest: 2024-07-31 10:06:20.89626375 -0700 PDT Remote: 2024-07-31 10:06:20.841731 -0700 PDT m=+78.507993684 (delta=54.53275ms)
	I0731 10:06:20.895779    3827 fix.go:200] guest clock delta is within tolerance: 54.53275ms
	I0731 10:06:20.895783    3827 start.go:83] releasing machines lock for "ha-393000-m03", held for 13.419701289s
	I0731 10:06:20.895800    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.895930    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:20.933794    3827 out.go:177] * Found network options:
	I0731 10:06:21.008361    3827 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0731 10:06:21.029193    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:06:21.029220    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:06:21.029239    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.029902    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.030149    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.030274    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:06:21.030303    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	W0731 10:06:21.030372    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:06:21.030402    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:06:21.030458    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:21.030487    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:06:21.030508    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:21.030615    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:21.030657    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:21.030724    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:21.030782    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:21.030837    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:21.030887    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:21.030941    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	W0731 10:06:21.060481    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:06:21.060548    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:06:21.113024    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:06:21.113039    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:06:21.113103    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:06:21.128523    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:06:21.136837    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:06:21.145325    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:06:21.145388    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:06:21.153686    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:06:21.162021    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:06:21.170104    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:06:21.178345    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:06:21.186720    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:06:21.195003    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:06:21.203212    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:06:21.211700    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:06:21.219303    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:06:21.226730    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:21.333036    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:06:21.355400    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:06:21.355468    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:06:21.370793    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:06:21.382599    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:06:21.397116    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:06:21.408366    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:06:21.419500    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:06:21.441593    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:06:21.453210    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:06:21.468638    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:06:21.471686    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:06:21.480107    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:06:21.493473    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:06:21.590098    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:06:21.695002    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:06:21.695025    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:06:21.709644    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:21.804799    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:06:24.090859    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.286034061s)
	I0731 10:06:24.090921    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:06:24.102085    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:06:24.115631    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:06:24.125950    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:06:24.222193    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:06:24.332843    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:24.449689    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:06:24.463232    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:06:24.474652    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:24.567486    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:06:24.631150    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:06:24.631230    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:06:24.635708    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:06:24.635764    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:06:24.638929    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:06:24.666470    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:06:24.666542    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:06:24.686587    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:06:24.729344    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:06:24.771251    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:06:24.792172    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 10:06:24.813314    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:24.813703    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:06:24.818215    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
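The `/etc/hosts` update above uses a filter-then-append idiom: strip any stale `host.minikube.internal` entry, append the current one, and copy the result back. A sketch against a scratch file (no sudo; paths and IPs here are stand-ins, not the VM's):

```shell
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n10.0.0.9\thost.minikube.internal\n' > "$hosts"
# remove any line ending in "<tab>host.minikube.internal", then append the fresh mapping
{ grep -v $'\thost.minikube.internal$' "$hosts"; printf '192.169.0.1\thost.minikube.internal\n'; } > "$hosts.new"
cat "$hosts.new"
rm -f "$hosts" "$hosts.new"
```

Writing to a temp file and `cp`-ing back (as the log does with `/tmp/h.$$`) avoids truncating `/etc/hosts` mid-read.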
	I0731 10:06:24.828147    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:06:24.828324    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:24.828531    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:24.828552    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:24.837259    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52063
	I0731 10:06:24.837609    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:24.837954    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:24.837967    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:24.838165    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:24.838272    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:06:24.838349    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:24.838424    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:06:24.839404    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:06:24.839647    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:24.839672    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:24.848293    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52065
	I0731 10:06:24.848630    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:24.848982    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:24.848999    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:24.849191    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:24.849297    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:06:24.849393    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.7
	I0731 10:06:24.849399    3827 certs.go:194] generating shared ca certs ...
	I0731 10:06:24.849408    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:06:24.849551    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:06:24.849606    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:06:24.849615    3827 certs.go:256] generating profile certs ...
	I0731 10:06:24.849710    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:06:24.849799    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb
	I0731 10:06:24.849848    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:06:24.849860    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:06:24.849881    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:06:24.849901    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:06:24.849920    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:06:24.849937    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:06:24.849955    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:06:24.849974    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:06:24.849991    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:06:24.850072    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:06:24.850109    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:06:24.850118    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:06:24.850152    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:06:24.850184    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:06:24.850218    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:06:24.850285    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:06:24.850322    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:06:24.850344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:06:24.850366    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:24.850395    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:06:24.850485    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:06:24.850565    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:06:24.850653    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:06:24.850732    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:06:24.882529    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0731 10:06:24.886785    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 10:06:24.896598    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0731 10:06:24.900384    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 10:06:24.910269    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 10:06:24.914011    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 10:06:24.922532    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0731 10:06:24.925784    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 10:06:24.936850    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0731 10:06:24.940321    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 10:06:24.950026    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0731 10:06:24.953055    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 10:06:24.962295    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:06:24.982990    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:06:25.003016    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:06:25.022822    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:06:25.043864    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:06:25.064140    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:06:25.084546    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:06:25.105394    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:06:25.125890    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:06:25.146532    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:06:25.166742    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:06:25.186545    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 10:06:25.200206    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 10:06:25.214106    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 10:06:25.228037    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 10:06:25.242065    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 10:06:25.255847    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 10:06:25.269574    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 10:06:25.283881    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:06:25.288466    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:06:25.297630    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.301289    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.301331    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.305714    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:06:25.314348    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:06:25.322967    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.326578    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.326634    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.330926    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:06:25.339498    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:06:25.348151    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.351535    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.351576    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.355921    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:06:25.364535    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:06:25.368077    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:06:25.372428    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:06:25.376757    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:06:25.380980    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:06:25.385296    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:06:25.389606    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
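The block of `openssl x509 ... -checkend 86400` runs above is a freshness probe: `-checkend N` exits 0 only if the certificate is still valid N seconds from now (86400 = 24 hours), which is what lets the provisioner skip regeneration. A self-contained sketch using a throwaway self-signed cert (the subject and paths are placeholders, not the cluster's certs):

```shell
key=$(mktemp); crt=$(mktemp)
# generate a short-lived self-signed cert to probe
openssl req -x509 -newkey rsa:2048 -keyout "$key" -out "$crt" \
  -days 2 -nodes -subj "/CN=demo" 2>/dev/null
# exit status 0 means the cert survives at least another 24h
if openssl x509 -noout -in "$crt" -checkend 86400; then
  echo "cert valid for at least 24h"
fi
rm -f "$key" "$crt"
```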
	I0731 10:06:25.393857    3827 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.30.3 docker true true} ...
	I0731 10:06:25.393914    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:06:25.393928    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:06:25.393959    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:06:25.405786    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:06:25.405830    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
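The kube-vip static-pod manifest above is configured almost entirely through `env` entries (VIP address, leader-election lease, control-plane load balancing). A small awk sketch that flattens such name/value pairs for inspection, fed a two-entry excerpt inline (the excerpt is illustrative; real manifests would be read from `/etc/kubernetes/manifests/kube-vip.yaml`):

```shell
# remember each "- name:" value, then emit name=value when the "value:" line follows
awk '/- name/{n=$NF} /value:/{print n"="$NF}' <<'EOF'
    - name: vip_arp
      value: "true"
    - name: address
      value: 192.169.0.254
EOF
```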
	I0731 10:06:25.405888    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:06:25.414334    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:06:25.414379    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 10:06:25.422310    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:06:25.435970    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:06:25.449652    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:06:25.463392    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:06:25.466266    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:06:25.476391    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:25.572265    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:06:25.587266    3827 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:06:25.587454    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:25.609105    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:06:25.650600    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:25.776520    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:06:25.790838    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:06:25.791048    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:06:25.791095    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:06:25.791257    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m03" to be "Ready" ...
	I0731 10:06:25.791299    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:25.791305    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.791311    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.791315    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.793351    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:25.793683    3827 node_ready.go:49] node "ha-393000-m03" has status "Ready":"True"
	I0731 10:06:25.793693    3827 node_ready.go:38] duration metric: took 2.426331ms for node "ha-393000-m03" to be "Ready" ...
	I0731 10:06:25.793700    3827 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:25.793737    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:25.793742    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.793753    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.793758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.797877    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:25.803934    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:25.803995    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:25.804000    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.804007    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.804011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.806477    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:25.806997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:25.807005    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.807011    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.807014    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.808989    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:26.304983    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:26.304998    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.305006    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.305010    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.307209    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:26.307839    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:26.307846    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.307852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.307861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.309644    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:26.805493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:26.805510    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.805520    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.805527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.821394    3827 round_trippers.go:574] Response Status: 200 OK in 15 milliseconds
	I0731 10:06:26.822205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:26.822215    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.822221    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.822224    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.827160    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:27.305824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:27.305839    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.305846    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.305848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.308258    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.308744    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:27.308752    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.308758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.308761    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.310974    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.805552    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:27.805567    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.805574    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.805578    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.807860    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.808403    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:27.808410    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.808416    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.808419    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.810436    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.810811    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:28.305577    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:28.305593    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.305600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.305604    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.311583    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:28.312446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:28.312455    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.312461    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.312465    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.314779    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:28.804391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:28.804407    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.804414    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.804420    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.806848    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:28.807227    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:28.807235    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.807241    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.807244    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.809171    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:29.305552    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:29.305615    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.305624    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.305629    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.308134    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.308891    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:29.308900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.308906    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.308909    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.311098    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.805109    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:29.805127    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.805192    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.805198    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.807898    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.808285    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:29.808292    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.808297    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.808300    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.810154    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:30.305017    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:30.305032    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.305045    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.305048    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.307205    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:30.307776    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:30.307783    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.307789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.307792    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.309771    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:30.310293    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:30.805366    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:30.805428    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.805436    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.805440    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.807864    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:30.808309    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:30.808316    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.808322    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.808325    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.810111    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:31.305667    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:31.305700    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.305708    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.305712    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.308126    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:31.308539    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:31.308546    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.308552    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.308556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.310279    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:31.804975    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:31.805002    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.805014    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.805020    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.808534    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:31.809053    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:31.809061    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.809066    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.809069    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.810955    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:32.304759    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:32.304815    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.304830    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.304839    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.308267    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:32.308684    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:32.308692    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.308698    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.308701    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.310475    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:32.310804    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:32.805138    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:32.805163    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.805175    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.805181    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.808419    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:32.809125    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:32.809133    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.809139    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.809143    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.810741    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:33.305088    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:33.305103    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.305109    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.305113    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.307495    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:33.307998    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:33.308005    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.308011    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.308015    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.309595    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:33.806000    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:33.806021    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.806049    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.806056    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.808625    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:33.809248    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:33.809259    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.809264    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.809269    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.810758    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:34.305752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:34.305832    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.305847    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.305853    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.308868    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:34.309591    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:34.309599    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.309605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.309608    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.311263    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:34.311627    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:34.804923    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:34.804948    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.804959    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.804965    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.808036    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:34.808636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:34.808646    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.808654    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.808670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.810398    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:35.305879    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:35.305966    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.305982    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.305991    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.309016    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:35.309584    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:35.309592    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.309598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.309601    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.311155    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:35.804092    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:35.804107    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.804114    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.804117    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.806476    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:35.806988    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:35.806997    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.807002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.807025    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.808897    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.305921    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:36.305943    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.305951    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.305955    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.308670    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:36.309170    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:36.309178    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.309184    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.309199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.310943    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.805015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:36.805085    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.805098    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.805106    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.808215    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:36.808810    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:36.808817    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.808823    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.808827    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.810482    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.810768    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:37.305031    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:37.305055    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.305068    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.305077    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.308209    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:37.308934    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:37.308942    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.308947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.308951    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.310514    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:37.805625    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:37.805671    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.805682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.805687    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.808188    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:37.808728    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:37.808735    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.808741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.808744    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.810288    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:38.305824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:38.305838    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.305845    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.305848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.307926    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:38.308378    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:38.308386    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.308391    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.308395    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.310092    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:38.805380    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:38.805397    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.805406    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.805410    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.807819    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:38.808368    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:38.808376    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.808382    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.808385    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.809904    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:39.305804    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:39.305820    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.305826    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.305830    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.307991    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:39.308527    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:39.308535    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.308541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.308546    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.310495    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:39.310929    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:39.806108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:39.806122    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.806129    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.806132    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.808192    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:39.808709    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:39.808718    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.808727    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.808730    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.810476    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:40.304101    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:40.304125    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.304137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.304144    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.307004    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:40.307629    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:40.307637    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.307643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.307646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.309373    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:40.804289    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:40.804302    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.804329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.804334    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.806678    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:40.807320    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:40.807328    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.807334    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.807338    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.809111    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:41.305710    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:41.305762    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.305770    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.305774    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.307795    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.308244    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:41.308252    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.308258    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.308261    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.310033    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:41.805219    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:41.805235    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.805242    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.805246    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.807574    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.808103    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:41.808112    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.808119    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.808123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.810305    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.810720    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:42.305509    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:42.305569    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.305580    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.305586    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.307774    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:42.308154    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:42.308161    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.308167    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.308170    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.309895    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:42.804631    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:42.804655    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.804667    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.804687    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.808080    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:42.808852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:42.808863    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.808869    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.808874    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.811059    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.304116    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:43.304217    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.304233    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.304239    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.306879    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.307340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:43.307348    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.307354    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.307358    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.308948    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:43.805920    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:43.805934    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.805981    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.805986    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.808009    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.808576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:43.808583    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.808589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.808592    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.810282    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:43.810804    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:44.304703    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:44.304728    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.304798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.304823    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.308376    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:44.308780    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:44.308787    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.308793    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.308797    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.310396    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:44.805218    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:44.805242    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.805255    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.805264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.808404    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:44.808967    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:44.808978    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.808986    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.808990    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.810748    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:45.304672    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:45.304770    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.304784    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.304791    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.307754    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:45.308249    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:45.308256    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.308261    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.308265    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.309903    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:45.804236    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:45.804265    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.804276    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.804281    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.807605    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:45.808214    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:45.808222    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.808228    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.808231    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.810076    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:46.305660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:46.305674    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.305723    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.305727    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.307959    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:46.308389    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:46.308397    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.308403    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.308406    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.310188    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:46.310668    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:46.805585    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:46.805685    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.805700    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.805708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.808399    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:46.808892    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:46.808900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.808910    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.808914    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.810397    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.304911    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:47.304926    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.304933    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.304936    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.307282    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.307761    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.307768    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.307774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.307777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.309541    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.309921    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.309931    3827 pod_ready.go:81] duration metric: took 21.505983976s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.309937    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.309966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 10:06:47.309971    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.309977    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.309980    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.311547    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.311995    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.312003    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.312009    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.312013    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.313414    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.313802    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.313811    3827 pod_ready.go:81] duration metric: took 3.869093ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.313818    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.313850    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 10:06:47.313855    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.313861    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.313865    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.315523    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.315938    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.315947    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.315955    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.315959    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.317522    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.317922    3827 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.317931    3827 pod_ready.go:81] duration metric: took 4.10711ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.317937    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.317971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 10:06:47.317976    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.317982    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.317985    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.319520    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.319893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:47.319900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.319906    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.319909    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.321439    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.321816    3827 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.321825    3827 pod_ready.go:81] duration metric: took 3.88293ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.321832    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.321862    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 10:06:47.321867    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.321872    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.321876    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.323407    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.323756    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:47.323763    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.323769    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.323773    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.325384    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.325703    3827 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.325712    3827 pod_ready.go:81] duration metric: took 3.875112ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.325727    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.505410    3827 request.go:629] Waited for 179.649549ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:06:47.505447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:06:47.505454    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.505462    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.505467    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.508003    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.705861    3827 request.go:629] Waited for 197.38651ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.705965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.705976    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.705987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.705997    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.708863    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.709477    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.709486    3827 pod_ready.go:81] duration metric: took 383.754198ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.709493    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.905743    3827 request.go:629] Waited for 196.205437ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:47.905783    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:47.905790    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.905812    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.905826    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.908144    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.106945    3827 request.go:629] Waited for 198.217758ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:48.106991    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:48.106998    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.107017    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.107023    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.109503    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.109889    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.109898    3827 pod_ready.go:81] duration metric: took 400.399458ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.109910    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.306502    3827 request.go:629] Waited for 196.553294ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:48.306576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:48.306583    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.306589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.306593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.308907    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.506077    3827 request.go:629] Waited for 196.82354ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:48.506171    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:48.506180    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.506189    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.506195    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.508341    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.508805    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.508814    3827 pod_ready.go:81] duration metric: took 398.898513ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.508829    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.706656    3827 request.go:629] Waited for 197.780207ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:48.706753    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:48.706765    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.706776    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.706784    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.709960    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:48.906621    3827 request.go:629] Waited for 195.987746ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:48.906714    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:48.906726    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.906737    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.906744    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.910100    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:48.910537    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.910550    3827 pod_ready.go:81] duration metric: took 401.715473ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.910559    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.106125    3827 request.go:629] Waited for 195.518023ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:49.106250    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:49.106262    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.106273    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.106280    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.109411    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:49.306599    3827 request.go:629] Waited for 196.360989ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:49.306720    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:49.306730    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.306741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.306747    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.309953    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:49.310311    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:49.310320    3827 pod_ready.go:81] duration metric: took 399.753992ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.310327    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.505092    3827 request.go:629] Waited for 194.718659ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:49.505129    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:49.505134    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.505140    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.505144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.510347    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:49.706499    3827 request.go:629] Waited for 195.722594ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:49.706547    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:49.706556    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.706623    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.706634    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.709639    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:49.710039    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:49.710049    3827 pod_ready.go:81] duration metric: took 399.716837ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.710061    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.906378    3827 request.go:629] Waited for 196.280735ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:49.906412    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:49.906418    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.906425    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.906442    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.911634    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:50.106586    3827 request.go:629] Waited for 194.536585ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:50.106637    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:50.106652    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.106717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.106725    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.109661    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:50.110176    3827 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.110189    3827 pod_ready.go:81] duration metric: took 400.121095ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.110197    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.306216    3827 request.go:629] Waited for 195.968962ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:50.306280    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:50.306286    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.306291    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.306301    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.308314    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:50.505180    3827 request.go:629] Waited for 196.336434ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:50.505320    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:50.505332    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.505344    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.505351    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.508601    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:50.509059    3827 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.509072    3827 pod_ready.go:81] duration metric: took 398.868353ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.509081    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.705014    3827 request.go:629] Waited for 195.886159ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:50.705123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:50.705134    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.705144    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.705151    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.708274    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:50.906912    3827 request.go:629] Waited for 198.179332ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:50.906985    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:50.906991    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.906997    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.907002    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.908938    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:50.909509    3827 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.909519    3827 pod_ready.go:81] duration metric: took 400.431581ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.909525    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.106576    3827 request.go:629] Waited for 197.012349ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:51.106660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:51.106668    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.106677    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.106682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.109021    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.305894    3827 request.go:629] Waited for 196.495089ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:51.305945    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:51.306000    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.306010    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.306018    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.308864    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.309301    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:51.309311    3827 pod_ready.go:81] duration metric: took 399.779835ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.309324    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.504969    3827 request.go:629] Waited for 195.610894ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:51.505060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:51.505066    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.505072    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.505076    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.507056    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:51.705447    3827 request.go:629] Waited for 197.942219ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:51.705508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:51.705515    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.705522    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.705527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.707999    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.708367    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:51.708379    3827 pod_ready.go:81] duration metric: took 399.049193ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.708391    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.906469    3827 request.go:629] Waited for 198.035792ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:51.906523    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:51.906531    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.906539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.906545    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.909082    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.105038    3827 request.go:629] Waited for 195.597271ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:52.105087    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:52.105095    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.105157    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.105168    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.108049    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.108591    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:52.108604    3827 pod_ready.go:81] duration metric: took 400.204131ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:52.108615    3827 pod_ready.go:38] duration metric: took 26.314911332s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:52.108628    3827 api_server.go:52] waiting for apiserver process to appear ...
	I0731 10:06:52.108680    3827 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:06:52.120989    3827 api_server.go:72] duration metric: took 26.533695803s to wait for apiserver process to appear ...
	I0731 10:06:52.121002    3827 api_server.go:88] waiting for apiserver healthz status ...
	I0731 10:06:52.121014    3827 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 10:06:52.124310    3827 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 10:06:52.124340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 10:06:52.124344    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.124353    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.124358    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.124912    3827 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 10:06:52.124978    3827 api_server.go:141] control plane version: v1.30.3
	I0731 10:06:52.124989    3827 api_server.go:131] duration metric: took 3.981645ms to wait for apiserver health ...
	I0731 10:06:52.124994    3827 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 10:06:52.305762    3827 request.go:629] Waited for 180.72349ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.305845    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.305853    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.305861    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.305872    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.310548    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:52.315274    3827 system_pods.go:59] 24 kube-system pods found
	I0731 10:06:52.315286    3827 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 10:06:52.315289    3827 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:52.315292    3827 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:52.315295    3827 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:52.315298    3827 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:52.315301    3827 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:52.315303    3827 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:52.315306    3827 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:52.315311    3827 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:52.315313    3827 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:52.315316    3827 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:52.315319    3827 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:52.315322    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:52.315327    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:52.315330    3827 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:52.315333    3827 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:52.315335    3827 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:52.315338    3827 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:52.315341    3827 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:52.315343    3827 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:52.315346    3827 system_pods.go:61] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:52.315348    3827 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:52.315350    3827 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:52.315353    3827 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:52.315358    3827 system_pods.go:74] duration metric: took 190.3593ms to wait for pod list to return data ...
	I0731 10:06:52.315363    3827 default_sa.go:34] waiting for default service account to be created ...
	I0731 10:06:52.505103    3827 request.go:629] Waited for 189.702061ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:52.505178    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:52.505187    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.505195    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.505199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.507558    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.507636    3827 default_sa.go:45] found service account: "default"
	I0731 10:06:52.507644    3827 default_sa.go:55] duration metric: took 192.276446ms for default service account to be created ...
	I0731 10:06:52.507666    3827 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 10:06:52.705427    3827 request.go:629] Waited for 197.710286ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.705484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.705497    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.705519    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.705526    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.711904    3827 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 10:06:52.716760    3827 system_pods.go:86] 24 kube-system pods found
	I0731 10:06:52.716772    3827 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 10:06:52.716777    3827 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:52.716780    3827 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:52.716783    3827 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:52.716787    3827 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:52.716790    3827 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:52.716794    3827 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:52.716798    3827 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:52.716801    3827 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:52.716805    3827 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:52.716809    3827 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:52.716813    3827 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:52.716816    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:52.716819    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:52.716823    3827 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:52.716827    3827 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:52.716830    3827 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:52.716833    3827 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:52.716836    3827 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:52.716854    3827 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:52.716860    3827 system_pods.go:89] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:52.716864    3827 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:52.716867    3827 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:52.716871    3827 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:52.716876    3827 system_pods.go:126] duration metric: took 209.203713ms to wait for k8s-apps to be running ...
	I0731 10:06:52.716881    3827 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 10:06:52.716936    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:06:52.731223    3827 system_svc.go:56] duration metric: took 14.33545ms WaitForService to wait for kubelet
	I0731 10:06:52.731240    3827 kubeadm.go:582] duration metric: took 27.143948309s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:06:52.731255    3827 node_conditions.go:102] verifying NodePressure condition ...
	I0731 10:06:52.906178    3827 request.go:629] Waited for 174.879721ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:52.906213    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:52.906218    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.906257    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.906264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.908378    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.909014    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909025    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909032    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909035    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909039    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909041    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909045    3827 node_conditions.go:105] duration metric: took 177.780993ms to run NodePressure ...
	I0731 10:06:52.909053    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:06:52.909067    3827 start.go:255] writing updated cluster config ...
	I0731 10:06:52.931184    3827 out.go:177] 
	I0731 10:06:52.952773    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:52.952858    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:52.974676    3827 out.go:177] * Starting "ha-393000-m04" worker node in "ha-393000" cluster
	I0731 10:06:53.016553    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:06:53.016583    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:06:53.016766    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:06:53.016784    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:06:53.016901    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:53.017869    3827 start.go:360] acquireMachinesLock for ha-393000-m04: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:06:53.017982    3827 start.go:364] duration metric: took 90.107µs to acquireMachinesLock for "ha-393000-m04"
	I0731 10:06:53.018005    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:06:53.018013    3827 fix.go:54] fixHost starting: m04
	I0731 10:06:53.018399    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:53.018423    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:53.027659    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52069
	I0731 10:06:53.028033    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:53.028349    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:53.028359    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:53.028586    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:53.028695    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:06:53.028810    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 10:06:53.028891    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.028978    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 10:06:53.029947    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid 3095 missing from process table
	I0731 10:06:53.029967    3827 fix.go:112] recreateIfNeeded on ha-393000-m04: state=Stopped err=<nil>
	I0731 10:06:53.029982    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	W0731 10:06:53.030076    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:06:53.051730    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m04" ...
	I0731 10:06:53.093566    3827 main.go:141] libmachine: (ha-393000-m04) Calling .Start
	I0731 10:06:53.093954    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.094004    3827 main.go:141] libmachine: (ha-393000-m04) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid
	I0731 10:06:53.094113    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Using UUID 8a49f5e0-ba79-41ac-9a76-c032dc065628
	I0731 10:06:53.120538    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Generated MAC d2:d8:fb:1d:1:ee
	I0731 10:06:53.120559    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:06:53.120750    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00032a1e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:53.120805    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00032a1e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:53.120864    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "8a49f5e0-ba79-41ac-9a76-c032dc065628", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:06:53.120909    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 8a49f5e0-ba79-41ac-9a76-c032dc065628 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:06:53.120925    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:06:53.122259    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Pid is 3870
	I0731 10:06:53.122766    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 0
	I0731 10:06:53.122781    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.122872    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3870
	I0731 10:06:53.125179    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 10:06:53.125242    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:06:53.125254    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:06:53.125266    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:06:53.125273    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:06:53.125280    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:06:53.125287    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Found match: d2:d8:fb:1d:1:ee
	I0731 10:06:53.125295    3827 main.go:141] libmachine: (ha-393000-m04) DBG | IP: 192.169.0.8
	I0731 10:06:53.125358    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetConfigRaw
	I0731 10:06:53.126014    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:06:53.126188    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:53.126707    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:06:53.126722    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:06:53.126959    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:06:53.127071    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:06:53.127158    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:06:53.127274    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:06:53.127389    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:06:53.127538    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:53.127705    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:06:53.127713    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:06:53.131247    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:06:53.140131    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:06:53.141373    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:53.141406    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:53.141429    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:53.141447    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:53.528683    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:06:53.528699    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:06:53.643451    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:53.643474    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:53.643483    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:53.643491    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:53.644344    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:06:53.644357    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:06:59.241509    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:06:59.241622    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:06:59.241636    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:06:59.265250    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:07:04.190144    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:07:04.190159    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.190326    3827 buildroot.go:166] provisioning hostname "ha-393000-m04"
	I0731 10:07:04.190338    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.190427    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.190528    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.190617    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.190711    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.190826    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.190962    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.191110    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.191119    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m04 && echo "ha-393000-m04" | sudo tee /etc/hostname
	I0731 10:07:04.259087    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m04
	
	I0731 10:07:04.259102    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.259236    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.259339    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.259439    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.259526    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.259647    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.259797    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.259811    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:07:04.323580    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:07:04.323604    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:07:04.323616    3827 buildroot.go:174] setting up certificates
	I0731 10:07:04.323623    3827 provision.go:84] configureAuth start
	I0731 10:07:04.323630    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.323758    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:04.323858    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.323932    3827 provision.go:143] copyHostCerts
	I0731 10:07:04.323960    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:07:04.324021    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:07:04.324027    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:07:04.324150    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:07:04.324352    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:07:04.324397    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:07:04.324402    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:07:04.324482    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:07:04.324627    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:07:04.324668    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:07:04.324674    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:07:04.324752    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:07:04.324900    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m04 san=[127.0.0.1 192.169.0.8 ha-393000-m04 localhost minikube]
	I0731 10:07:04.518738    3827 provision.go:177] copyRemoteCerts
	I0731 10:07:04.518793    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:07:04.518809    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.518951    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.519038    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.519124    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.519202    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:04.553750    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:07:04.553834    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:07:04.574235    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:07:04.574311    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:07:04.594359    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:07:04.594433    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:07:04.614301    3827 provision.go:87] duration metric: took 290.6663ms to configureAuth
	I0731 10:07:04.614319    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:07:04.614509    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:04.614526    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:04.614676    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.614777    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.614880    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.614987    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.615110    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.615236    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.615386    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.615394    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:07:04.672493    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:07:04.672505    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:07:04.672600    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:07:04.672612    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.672752    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.672835    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.672958    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.673042    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.673159    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.673303    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.673352    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:07:04.741034    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	Environment=NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:07:04.741052    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.741187    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.741288    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.741387    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.741494    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.741621    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.741755    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.741771    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:07:06.325916    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:07:06.325931    3827 machine.go:97] duration metric: took 13.199216588s to provisionDockerMachine
	I0731 10:07:06.325941    3827 start.go:293] postStartSetup for "ha-393000-m04" (driver="hyperkit")
	I0731 10:07:06.325948    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:07:06.325960    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.326146    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:07:06.326163    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.326257    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.326346    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.326438    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.326522    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.369998    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:07:06.375343    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:07:06.375359    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:07:06.375470    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:07:06.375663    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:07:06.375669    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:07:06.375894    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:07:06.394523    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:07:06.415884    3827 start.go:296] duration metric: took 89.928396ms for postStartSetup
	I0731 10:07:06.415906    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.416074    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:07:06.416088    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.416193    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.416287    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.416381    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.416451    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.451487    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:07:06.451545    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:07:06.482558    3827 fix.go:56] duration metric: took 13.464545279s for fixHost
	I0731 10:07:06.482584    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.482724    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.482806    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.482891    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.482992    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.483122    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:06.483263    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:06.483270    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:07:06.539713    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445626.658160546
	
	I0731 10:07:06.539725    3827 fix.go:216] guest clock: 1722445626.658160546
	I0731 10:07:06.539731    3827 fix.go:229] Guest: 2024-07-31 10:07:06.658160546 -0700 PDT Remote: 2024-07-31 10:07:06.482574 -0700 PDT m=+124.148842929 (delta=175.586546ms)
	I0731 10:07:06.539746    3827 fix.go:200] guest clock delta is within tolerance: 175.586546ms
	I0731 10:07:06.539751    3827 start.go:83] releasing machines lock for "ha-393000-m04", held for 13.521760862s
	I0731 10:07:06.539766    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.539895    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:06.564336    3827 out.go:177] * Found network options:
	I0731 10:07:06.583958    3827 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	W0731 10:07:06.605128    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605143    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605170    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:07:06.605183    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605593    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605717    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605786    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:07:06.605816    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	W0731 10:07:06.605831    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605845    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605864    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:07:06.605930    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:07:06.605931    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.605944    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.606068    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.606081    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.606172    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.606197    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.606270    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.606322    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.606369    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	W0731 10:07:06.638814    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:07:06.638878    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:07:06.685734    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:07:06.685752    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:07:06.685831    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:07:06.701869    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:07:06.710640    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:07:06.719391    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:07:06.719452    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:07:06.728151    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:07:06.736695    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:07:06.745525    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:07:06.754024    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:07:06.762489    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:07:06.770723    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:07:06.779179    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:07:06.787524    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:07:06.795278    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:07:06.802833    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:06.908838    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:07:06.929085    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:07:06.929153    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:07:06.946994    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:07:06.958792    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:07:06.977007    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:07:06.987118    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:07:06.998383    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:07:07.019497    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:07:07.030189    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:07:07.045569    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:07:07.048595    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:07:07.055870    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:07:07.070037    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:07:07.166935    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:07:07.272420    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:07:07.272447    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:07:07.286182    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:07.397807    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:07:09.678871    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.281044692s)
	I0731 10:07:09.678935    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:07:09.691390    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:07:09.706154    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:07:09.718281    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:07:09.818061    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:07:09.918372    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:10.020296    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:07:10.034132    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:07:10.045516    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:10.140924    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:07:10.198542    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:07:10.198622    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:07:10.202939    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:07:10.203007    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:07:10.206254    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:07:10.238107    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:07:10.238184    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:07:10.256129    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:07:10.301307    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:07:10.337880    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:07:10.396169    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 10:07:10.454080    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	I0731 10:07:10.491070    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:10.491478    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:07:10.496573    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:07:10.506503    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:07:10.506687    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:10.506931    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:07:10.506954    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:07:10.515949    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52091
	I0731 10:07:10.516322    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:07:10.516656    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:07:10.516668    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:07:10.516893    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:07:10.517004    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:07:10.517099    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:07:10.517181    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:07:10.518192    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:07:10.518454    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:07:10.518477    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:07:10.527151    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52093
	I0731 10:07:10.527586    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:07:10.527914    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:07:10.527931    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:07:10.528158    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:07:10.528268    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:07:10.528367    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.8
	I0731 10:07:10.528374    3827 certs.go:194] generating shared ca certs ...
	I0731 10:07:10.528388    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:07:10.528576    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:07:10.528655    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:07:10.528666    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:07:10.528692    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:07:10.528712    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:07:10.528731    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:07:10.528834    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:07:10.528887    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:07:10.528897    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:07:10.528933    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:07:10.528968    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:07:10.529000    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:07:10.529077    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:07:10.529114    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.529135    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.529152    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.529176    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:07:10.550191    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:07:10.570588    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:07:10.590746    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:07:10.611034    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:07:10.631281    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:07:10.651472    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:07:10.671880    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:07:10.676790    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:07:10.685541    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.689430    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.689496    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.694391    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:07:10.703456    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:07:10.712113    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.715734    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.715795    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.720285    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:07:10.728964    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:07:10.737483    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.741091    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.741135    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.745570    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:07:10.754084    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:07:10.757225    3827 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 10:07:10.757258    3827 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.30.3 docker false true} ...
	I0731 10:07:10.757327    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:07:10.757375    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:07:10.764753    3827 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 10:07:10.764797    3827 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256
	I0731 10:07:10.772338    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 10:07:10.772344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256
	I0731 10:07:10.772398    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:07:10.772434    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 10:07:10.772437    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 10:07:10.780324    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 10:07:10.780354    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 10:07:10.780356    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 10:07:10.780369    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 10:07:10.799303    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 10:07:10.799462    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 10:07:10.842469    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 10:07:10.842511    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 10:07:11.478912    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0731 10:07:11.486880    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:07:11.501278    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:07:11.515550    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:07:11.518663    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:07:11.528373    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:11.625133    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:07:11.645175    3827 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0731 10:07:11.645375    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:11.651211    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:07:11.692705    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:11.797111    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:07:12.534860    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:07:12.535084    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:07:12.535128    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:07:12.535291    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m04" to be "Ready" ...
	I0731 10:07:12.535335    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:12.535339    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:12.535359    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:12.535366    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:12.537469    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:13.035600    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:13.035613    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:13.035620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:13.035622    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:13.037811    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:13.536601    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:13.536621    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:13.536630    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:13.536636    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:13.539103    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.035926    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:14.035943    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:14.035952    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:14.035957    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:14.038327    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.535691    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:14.535711    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:14.535719    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:14.535723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:14.538107    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.538174    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:15.035707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:15.035726    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:15.035735    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:15.035739    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:15.037991    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:15.535587    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:15.535602    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:15.535658    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:15.535663    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:15.537787    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.035475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:16.035497    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:16.035550    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:16.035555    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:16.037882    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.536666    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:16.536687    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:16.536712    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:16.536719    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:16.538796    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.538904    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:17.035473    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:17.035488    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:17.035495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:17.035498    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:17.037610    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:17.535997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:17.536074    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:17.536089    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:17.536096    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:17.539102    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:18.035624    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:18.035638    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:18.035646    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:18.035652    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:18.037956    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:18.535491    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:18.535589    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:18.535603    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:18.535610    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:18.538819    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:18.538965    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:19.036954    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:19.037007    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:19.037028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:19.037033    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:19.039345    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:19.536847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:19.536862    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:19.536870    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:19.536873    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:19.538820    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:20.037064    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:20.037079    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:20.037086    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:20.037089    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:20.038945    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:20.536127    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:20.536138    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:20.536145    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:20.536150    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:20.538039    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:21.036613    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:21.036684    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:21.036695    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:21.036701    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:21.039123    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:21.039186    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:21.536684    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:21.536700    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:21.536705    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:21.536708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:21.538918    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:22.036722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:22.036736    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:22.036743    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:22.036746    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:22.038627    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:22.536686    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:22.536704    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:22.536714    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:22.536718    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:22.538549    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:23.036470    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:23.036482    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:23.036489    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:23.036494    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:23.038533    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:23.535581    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:23.535639    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:23.535653    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:23.535667    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:23.539678    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:23.539740    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:24.036874    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:24.036948    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:24.036959    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:24.036965    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:24.039843    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:24.536241    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:24.536307    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:24.536318    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:24.536323    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:24.538807    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:25.036279    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:25.036343    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:25.036356    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:25.036362    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:25.038454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:25.535942    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:25.535954    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:25.535962    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:25.535967    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:25.538068    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:26.036823    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:26.036838    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:26.036845    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:26.036848    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:26.038942    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:26.039008    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:26.535480    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:26.535499    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:26.535533    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:26.535539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:26.538039    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:27.036202    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:27.036213    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:27.036219    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:27.036222    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:27.038071    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:27.537206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:27.537226    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:27.537236    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:27.537248    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:27.539573    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:28.036203    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:28.036217    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:28.036223    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:28.036225    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:28.038017    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:28.536971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:28.536988    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:28.536998    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:28.537003    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:28.539378    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:28.539442    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:29.035655    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:29.035667    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:29.035673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:29.035676    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:29.037786    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:29.537109    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:29.537124    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:29.537132    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:29.537144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:29.539430    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:30.035887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:30.035899    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:30.035905    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:30.035908    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:30.037803    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:30.535679    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:30.535701    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:30.535718    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:30.535723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:30.539029    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:31.036151    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:31.036166    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:31.036175    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:31.036179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:31.038532    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:31.038593    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:31.536698    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:31.536710    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:31.536717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:31.536720    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:31.538484    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:32.037162    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:32.037178    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:32.037185    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:32.037188    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:32.039081    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:32.536065    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:32.536085    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:32.536095    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:32.536099    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:32.538365    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:33.036492    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:33.036513    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:33.036523    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:33.036527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:33.038851    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:33.038919    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:33.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:33.535566    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:33.535572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:33.535576    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:33.537575    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:34.036894    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:34.036912    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:34.036923    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:34.036932    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:34.040173    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:34.535858    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:34.535912    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:34.535919    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:34.535922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:34.537915    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:35.036636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:35.036670    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:35.036677    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:35.036682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:35.038861    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:35.038930    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:35.535814    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:35.535827    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:35.535835    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:35.535840    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:35.538360    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:36.035769    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:36.035785    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:36.035795    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:36.035799    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:36.038202    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:36.535426    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:36.535438    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:36.535445    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:36.535449    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:36.537303    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:37.035456    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:37.035470    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:37.035479    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:37.035483    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:37.037630    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:37.536548    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:37.536562    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:37.536568    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:37.536572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:37.538659    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:37.538720    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:38.036407    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:38.036421    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:38.036427    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:38.036432    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:38.038467    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:38.537359    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:38.537378    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:38.537387    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:38.537392    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:38.539892    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:39.036414    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:39.036470    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:39.036486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:39.036495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:39.039521    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:39.535817    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:39.535832    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:39.535839    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:39.535843    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:39.537796    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:40.035880    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:40.035896    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:40.035902    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:40.035906    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:40.037712    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:40.037778    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:40.535492    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:40.535523    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:40.535536    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:40.535543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:40.538475    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:41.035745    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:41.035758    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:41.035770    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:41.035774    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:41.037656    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:41.535726    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:41.535738    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:41.535744    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:41.535747    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:41.537897    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:42.036537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:42.036554    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:42.036564    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:42.036573    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:42.039525    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:42.039600    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:42.535450    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:42.535465    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:42.535472    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:42.535475    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:42.537399    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:43.035576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:43.035592    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:43.035598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:43.035602    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:43.038048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:43.536787    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:43.536822    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:43.536832    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:43.536837    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:43.539146    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:44.036148    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:44.036161    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:44.036169    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:44.036173    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:44.038382    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:44.536653    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:44.536709    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:44.536717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:44.536720    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:44.538695    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:44.538753    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:45.036650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:45.036662    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:45.036668    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:45.036672    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:45.038555    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:45.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:45.535571    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:45.535582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:45.535590    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:45.538335    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:46.035712    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:46.035726    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:46.035735    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:46.035740    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:46.038035    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:46.535534    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:46.535549    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:46.535557    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:46.535564    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:46.537974    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:47.035871    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:47.035887    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:47.035893    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:47.035897    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:47.037864    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:47.037931    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:47.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:47.535564    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:47.535570    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:47.535573    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:47.537590    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:48.035461    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:48.035531    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:48.035539    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:48.035543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:48.037510    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:48.536520    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:48.536535    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:48.536541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:48.536544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:48.538561    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:49.035436    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:49.035448    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:49.035454    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:49.035458    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:49.037204    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:49.535574    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:49.535586    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:49.535592    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:49.535595    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:49.537443    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:49.537505    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:50.036533    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:50.036547    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:50.036562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:50.036566    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:50.038478    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:50.536624    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:50.536636    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:50.536642    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:50.536646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:50.538734    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.036016    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:51.036035    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:51.036044    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:51.036049    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:51.038643    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.536662    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:51.536677    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:51.536686    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:51.536691    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:51.539033    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.539099    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:52.036475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:52.036490    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:52.036499    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:52.036503    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:52.038975    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:52.537013    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:52.537034    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:52.537041    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:52.537045    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:52.539229    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.037093    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:53.037106    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:53.037113    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:53.037117    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:53.039169    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.536447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:53.536468    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:53.536478    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:53.536486    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:53.539425    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.539565    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:54.035597    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:54.035609    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:54.035615    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:54.035618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:54.037574    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:54.535484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:54.535503    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:54.535509    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:54.535514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:54.537529    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:55.036258    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:55.036270    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:55.036277    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:55.036280    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:55.038186    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:55.536493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:55.536513    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:55.536526    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:55.536533    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:55.539517    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:55.539589    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	[... identical 404 poll cycles for node "ha-393000-m04" (GET every ~500ms, `node_ready.go:53` error every ~2s) repeat from 10:07:56 through 10:08:28 ...]
	I0731 10:08:29.037018    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:29.037032    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:29.037039    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:29.037042    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:29.038983    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:29.039043    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:29.535757    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:29.535769    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:29.535775    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:29.535778    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:29.537697    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:30.036529    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:30.036548    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:30.036557    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:30.036562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:30.038833    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:30.535560    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:30.535570    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:30.535576    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:30.535579    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:30.537657    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:31.035508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:31.035520    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:31.035527    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:31.035531    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:31.037575    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:31.536786    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:31.536800    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:31.536806    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:31.536809    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:31.538674    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:31.538731    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:32.035819    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:32.035833    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:32.035842    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:32.035848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:32.038170    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:32.535455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:32.535471    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:32.535481    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:32.535487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:32.537802    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:33.037123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:33.037156    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:33.037166    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:33.037171    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:33.039252    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:33.535741    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:33.535754    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:33.535760    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:33.535763    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:33.537979    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:34.035638    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:34.035651    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:34.035658    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:34.035661    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:34.037722    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:34.037778    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:34.535808    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:34.535823    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:34.535831    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:34.535834    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:34.538223    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:35.036584    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:35.036609    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:35.036620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:35.036625    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:35.039788    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:35.535720    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:35.535732    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:35.535738    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:35.535741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:35.537506    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:36.036439    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:36.036484    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:36.036492    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:36.036498    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:36.038534    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:36.038591    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:36.535446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:36.535458    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:36.535465    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:36.535467    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:36.537309    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:37.035737    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:37.035776    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:37.035789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:37.035794    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:37.037928    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:37.535410    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:37.535422    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:37.535430    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:37.535433    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:37.537627    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:38.036658    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:38.036738    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:38.036753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:38.036760    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:38.039378    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:38.039521    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:38.535459    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:38.535474    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:38.535490    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:38.535494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:38.537817    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:39.036931    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:39.036949    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:39.036957    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:39.036962    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:39.039286    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:39.536447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:39.536472    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:39.536487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:39.536491    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:39.538440    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:40.036354    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:40.036378    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:40.036463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:40.036469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:40.039363    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:40.535847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:40.535866    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:40.535878    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:40.535883    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:40.538740    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:40.538822    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:41.036206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:41.036221    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:41.036229    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:41.036234    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:41.038292    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:41.535741    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:41.535753    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:41.535759    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:41.535764    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:41.537837    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:42.036537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:42.036558    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:42.036566    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:42.036570    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:42.039104    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:42.536474    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:42.536484    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:42.536491    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:42.536495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:42.538339    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:43.035887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:43.035913    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:43.035925    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:43.035931    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:43.038963    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:43.039028    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:43.537036    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:43.537050    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:43.537056    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:43.537059    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:43.539282    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:44.035937    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:44.035949    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:44.035954    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:44.035958    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:44.037693    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:44.536399    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:44.536470    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:44.536481    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:44.536485    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:44.538818    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:45.036937    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:45.036952    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:45.036960    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:45.036966    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:45.039363    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:45.039449    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:45.535403    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:45.535415    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:45.535421    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:45.535424    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:45.537208    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:46.037001    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:46.037088    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:46.037104    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:46.037110    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:46.040342    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:46.536255    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:46.536269    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:46.536278    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:46.536284    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:46.538801    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:47.037251    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:47.037286    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:47.037297    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:47.037304    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:47.039048    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:47.537021    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:47.537064    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:47.537071    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:47.537076    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:47.539084    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:47.539154    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:48.037354    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:48.037369    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:48.037376    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:48.037379    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:48.039646    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:48.536219    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:48.536236    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:48.536272    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:48.536276    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:48.538242    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:49.035446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:49.035459    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:49.035465    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:49.035469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:49.037563    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:49.535517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:49.535533    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:49.535540    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:49.535543    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:49.537433    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:50.036639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:50.036659    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:50.036665    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:50.036670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:50.038735    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:50.038803    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:50.535659    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:50.535678    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:50.535690    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:50.535697    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:50.538598    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:51.036768    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:51.036782    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:51.036789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:51.036794    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:51.038898    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:51.536592    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:51.536608    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:51.536616    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:51.536621    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:51.539087    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:52.036618    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:52.036639    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:52.036652    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:52.036658    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:52.039828    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:52.039911    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:52.535902    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:52.535912    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:52.535919    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:52.535922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:52.537950    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:53.036636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:53.036705    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:53.036716    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:53.036721    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:53.039002    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:53.535455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:53.535467    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:53.535473    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:53.535476    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:53.537615    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:54.036291    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:54.036325    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:54.036406    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:54.036414    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:54.039211    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:54.535751    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:54.535763    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:54.535769    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:54.535772    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:54.537488    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:54.537606    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:55.036966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:55.036982    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:55.036988    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:55.036992    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:55.038791    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:55.537260    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:55.537303    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:55.537312    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:55.537315    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:55.539579    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:56.036346    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:56.036359    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:56.036367    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:56.036370    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:56.038527    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:56.536015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:56.536055    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:56.536063    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:56.536068    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:56.538048    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:56.538106    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:57.036625    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:57.036637    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:57.036643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:57.036646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:57.038481    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:57.536731    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:57.536744    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:57.536749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:57.536752    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:57.538619    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:58.037081    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:58.037160    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:58.037174    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:58.037182    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:58.040222    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:58.535441    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:58.535453    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:58.535460    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:58.535463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:58.537373    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:59.037130    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:59.037151    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:59.037161    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:59.037181    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:59.039237    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:59.039342    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:59.536756    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:59.536768    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:59.536774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:59.536777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:59.538430    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:00.036701    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:00.036714    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:00.036720    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:00.036723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:00.038842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:00.535558    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:00.535574    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:00.535620    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:00.535625    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:00.537993    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.036274    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:01.036293    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:01.036302    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:01.036305    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:01.038700    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.536455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:01.536488    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:01.536495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:01.536511    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:01.538672    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.538736    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:02.036272    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:02.036286    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:02.036291    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:02.036295    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:02.038419    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:02.535392    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:02.535405    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:02.535416    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:02.535419    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:02.537336    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:03.036249    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:03.036264    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:03.036271    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:03.036276    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:03.038181    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:03.536990    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:03.537012    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:03.537020    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:03.537024    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:03.541054    3827 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0731 10:09:03.541125    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:04.036809    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:04.036887    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:04.036896    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:04.036902    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:04.039202    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:04.537089    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:04.537152    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:04.537166    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:04.537904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:04.540615    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:05.036817    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:05.036832    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:05.036838    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:05.036842    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:05.038865    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:05.535412    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:05.535430    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:05.535438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:05.535446    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:05.538103    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:06.036140    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:06.036160    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:06.036172    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:06.036179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:06.039025    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:06.039098    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:06.536908    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:06.536923    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:06.536930    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:06.536933    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:06.538854    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:07.035951    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:07.035965    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:07.035974    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:07.035979    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:07.038105    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:07.535618    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:07.535629    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:07.535635    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:07.535637    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:07.537552    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:08.036184    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:08.036212    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:08.036273    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:08.036279    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:08.038850    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:08.536040    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:08.536056    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:08.536065    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:08.536069    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:08.538402    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:08.538460    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:09.036971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:09.037018    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:09.037025    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:09.037031    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:09.039100    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:09.535468    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:09.535480    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:09.535487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:09.535490    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:09.537589    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.035464    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:10.035479    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:10.035491    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:10.035506    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:10.037831    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.536550    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:10.536622    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:10.536632    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:10.536638    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:10.539005    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.539064    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:11.037316    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:11.037399    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:11.037415    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:11.037425    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:11.040113    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:11.536965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:11.536989    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:11.537033    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:11.537044    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:11.539689    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:12.036399    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:12.036469    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:12.036480    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:12.036486    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:12.038399    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:12.535441    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:12.535463    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:12.535475    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:12.535486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:12.539207    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:12.539333    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:13.036110    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:13.036220    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:13.036231    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:13.036236    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:13.038510    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:13.535970    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:13.535990    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:13.536002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:13.536008    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:13.539197    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:14.037193    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:14.037263    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:14.037274    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:14.037286    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:14.039603    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:14.535571    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:14.535586    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:14.535591    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:14.535594    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:14.537915    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:15.036611    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:15.036630    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:15.036642    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:15.036648    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:15.039592    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:15.039739    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:15.535565    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:15.535590    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:15.535602    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:15.535608    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:15.539127    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:16.035884    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:16.035904    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:16.035915    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:16.035919    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:16.038938    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:16.535882    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:16.535893    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:16.535900    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:16.535904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:16.537836    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:17.036590    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:17.036605    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:17.036613    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:17.036618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:17.039082    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:17.535436    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:17.535454    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:17.535466    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:17.535472    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:17.539228    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:17.539295    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:18.035478    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:18.035491    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:18.035505    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:18.035509    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:18.037946    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:18.536869    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:18.536884    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:18.536890    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:18.536896    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:18.538941    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:19.035847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:19.035859    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:19.035865    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:19.035868    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:19.037761    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:19.536117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:19.536142    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:19.536154    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:19.536160    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:19.539347    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:19.539466    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:20.036919    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:20.036993    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:20.037004    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:20.037009    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:20.039230    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:20.536619    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:20.536716    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:20.536731    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:20.536738    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:20.539591    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:21.036024    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:21.036114    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:21.036129    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:21.036136    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:21.038666    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:21.535434    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:21.535447    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:21.535453    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:21.535457    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:21.537251    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:22.037204    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:22.037219    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:22.037228    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:22.037234    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:22.039524    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:22.039581    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:22.536431    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:22.536450    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:22.536464    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:22.536473    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:22.539233    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:23.035562    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:23.035606    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:23.035627    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:23.035634    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:23.037971    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:23.536650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:23.536675    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:23.536742    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:23.536752    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:23.539879    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:24.035514    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:24.035529    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:24.035535    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:24.035544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:24.037431    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:24.536058    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:24.536156    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:24.536171    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:24.536179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:24.538730    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:24.538810    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:25.036752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:25.036804    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:25.036814    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:25.036821    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:25.039117    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:25.535569    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:25.535587    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:25.535596    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:25.535600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:25.538114    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:26.035517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:26.035542    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:26.035556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:26.035562    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:26.038485    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:26.536365    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:26.536379    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:26.536386    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:26.536390    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:26.538690    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:27.036639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:27.036652    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:27.036703    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:27.036709    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:27.038432    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:27.038498    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:27.535539    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:27.535560    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:27.535572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:27.535580    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:27.538434    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:28.035626    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:28.035638    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:28.035644    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:28.035647    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:28.037699    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:28.536177    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:28.536199    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:28.536212    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:28.536217    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:28.539218    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:29.036925    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:29.036950    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:29.036962    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:29.036969    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:29.040007    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:29.040064    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:29.537194    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:29.537209    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:29.537228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:29.537240    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:29.539598    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:30.036373    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:30.036471    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:30.036486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:30.036494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:30.039302    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:30.536789    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:30.536807    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:30.536815    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:30.536820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:30.539885    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:31.036599    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:31.036624    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:31.036635    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:31.036643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:31.039815    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:31.536237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:31.536285    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:31.536295    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:31.536301    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:31.538680    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:31.538744    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:32.036451    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:32.036463    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:32.036469    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:32.036472    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:32.038847    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:32.536969    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:32.537019    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:32.537032    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:32.537041    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:32.539636    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:33.035557    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:33.035573    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:33.035582    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:33.035587    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:33.038048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:33.535485    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:33.535509    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:33.535522    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:33.535529    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:33.538268    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:34.035811    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:34.035830    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:34.035841    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:34.035846    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:34.038580    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:34.038645    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:34.535515    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:34.535533    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:34.535543    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:34.535562    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:34.537523    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:35.036865    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:35.036880    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:35.036887    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:35.036890    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:35.038894    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:35.535476    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:35.535566    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:35.535574    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:35.535579    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:35.537495    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:36.036205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:36.036221    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:36.036227    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:36.036231    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:36.038994    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:36.039061    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:36.536105    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:36.536117    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:36.536124    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:36.536127    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:36.538020    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:37.036112    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:37.036124    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:37.036130    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:37.036134    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:37.037953    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:37.536082    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:37.536101    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:37.536110    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:37.536114    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:37.538459    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:38.035493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:38.035509    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:38.035517    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:38.035524    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:38.037791    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:38.535613    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:38.535632    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:38.535645    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:38.535668    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:38.539185    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:38.539281    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:39.036660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:39.036682    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:39.036693    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:39.036700    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:39.039452    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:39.535986    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:39.536000    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:39.536007    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:39.536011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:39.537968    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:40.036939    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:40.037010    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:40.037021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:40.037026    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:40.039435    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:40.536149    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:40.536171    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:40.536233    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:40.536239    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:40.538338    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:41.036629    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:41.036641    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:41.036647    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:41.036651    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:41.038835    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:41.038897    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:41.536269    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:41.536280    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:41.536287    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:41.536290    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:41.538277    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:42.036495    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:42.036511    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:42.036520    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:42.036524    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:42.038560    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:42.537182    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:42.537201    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:42.537210    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:42.537215    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:42.539833    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.035857    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:43.035874    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:43.035881    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:43.035891    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:43.038530    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.536377    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:43.536465    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:43.536480    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:43.536488    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:43.539159    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.539217    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:44.036979    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:44.037065    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:44.037081    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:44.037089    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:44.039312    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:44.536993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:44.537011    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:44.537018    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:44.537063    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:44.539131    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.036929    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:45.036952    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:45.037050    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:45.037064    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:45.039700    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.537089    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:45.537112    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:45.537123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:45.537132    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:45.539940    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.540011    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:46.036811    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:46.036857    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:46.036868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:46.036882    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:46.039540    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:46.535831    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:46.535845    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:46.535852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:46.535856    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:46.538387    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:47.036117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:47.036128    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:47.036134    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:47.036137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:47.037871    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:47.536504    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:47.536553    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:47.536564    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:47.536568    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:47.538867    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:48.036960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:48.036980    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:48.036992    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:48.036998    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:48.040512    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:48.041066    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:48.535514    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:48.535532    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:48.535542    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:48.535547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:48.537881    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:49.036112    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:49.036124    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:49.036130    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:49.036133    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:49.038899    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:49.536876    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:49.536893    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:49.536899    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:49.536904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:49.538675    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:50.037190    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:50.037204    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:50.037213    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:50.037216    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:50.039015    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:50.536824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:50.536920    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:50.536935    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:50.536942    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:50.539735    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:50.539808    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:51.035683    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:51.035696    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:51.035702    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:51.035706    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:51.038883    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:51.536861    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:51.536882    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:51.536894    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:51.536901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:51.539779    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:52.035474    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:52.035485    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:52.035493    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:52.035499    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:52.037401    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:52.536642    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:52.536661    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:52.536669    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:52.536674    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:52.538949    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:53.036427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:53.036471    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:53.036482    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:53.036487    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:53.038951    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:53.039010    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:53.535427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:53.535439    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:53.535446    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:53.535450    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:53.537257    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:54.036806    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:54.036821    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:54.036828    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:54.036832    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:54.039021    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:54.535805    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:54.535897    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:54.535912    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:54.535919    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:54.538990    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:55.036521    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:55.036539    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:55.036546    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:55.036549    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:55.038766    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:55.536647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:55.536714    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:55.536723    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:55.536727    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:55.539055    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:55.539163    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:56.035522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:56.035534    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:56.035541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:56.035545    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:56.038160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:56.535916    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:56.535934    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:56.535943    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:56.535949    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:56.538329    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:57.036391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:57.036406    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:57.036413    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:57.036417    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:57.038267    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:57.535390    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:57.535439    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:57.535447    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:57.535452    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:57.537243    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:58.036752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:58.036778    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:58.036805    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:58.036809    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:58.038620    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:58.038682    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:58.536471    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:58.536516    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:58.536526    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:58.536532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:58.538643    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:59.035837    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:59.035851    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:59.035858    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:59.035861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:59.037705    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:59.536730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:59.536832    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:59.536848    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:59.536854    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:59.539682    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:00.035558    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:00.035587    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:00.035600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:00.035612    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:00.037523    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:00.535512    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:00.535528    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:00.535534    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:00.535537    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:00.537603    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:00.537667    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:01.036888    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:01.036943    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:01.036951    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:01.036955    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:01.038774    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:01.535488    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:01.535504    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:01.535513    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:01.535517    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:01.538017    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:02.036031    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:02.036045    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:02.036051    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:02.036054    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:02.037488    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:02.537218    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:02.537285    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:02.537295    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:02.537300    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:02.539559    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:02.539701    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:03.036241    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:03.036256    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:03.036263    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:03.036269    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:03.037763    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:03.536877    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:03.536892    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:03.536901    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:03.536904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:03.539168    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:04.035721    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:04.035733    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:04.035739    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:04.035742    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:04.037607    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:04.535679    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:04.535694    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:04.535703    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:04.535707    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:04.537920    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:05.037180    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:05.037195    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:05.037201    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:05.037205    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:05.038872    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:05.038947    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:05.536233    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:05.536248    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:05.536254    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:05.536258    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:05.538191    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:06.036830    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:06.036845    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:06.036852    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:06.036856    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:06.038427    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:06.536722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:06.536735    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:06.536741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:06.536753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:06.538631    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:07.036171    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:07.036186    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:07.036192    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:07.036195    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:07.038330    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:07.536466    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:07.536481    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:07.536488    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:07.536492    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:07.538446    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:07.538510    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:08.036787    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:08.036821    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:08.036832    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:08.036853    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:08.039084    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:08.535567    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:08.535582    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:08.535589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:08.535593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:08.537711    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.035421    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:09.035432    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:09.035438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:09.035442    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:09.037921    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.535887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:09.535904    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:09.535913    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:09.535943    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:09.538516    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.538592    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:10.035458    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:10.035469    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:10.035474    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:10.035477    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:10.038652    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:10.535979    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:10.535992    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:10.535998    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:10.536002    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:10.537981    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:11.035819    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:11.035886    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:11.035897    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:11.035901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:11.038043    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:11.535475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:11.535487    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:11.535494    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:11.535497    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:11.537395    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:12.036578    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:12.036591    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:12.036598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:12.036601    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:12.038621    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:12.038676    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:12.536927    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:12.536941    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:12.536947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:12.536952    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:12.539050    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:13.036386    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:13.036399    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:13.036428    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:13.036433    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:13.038022    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:13.536356    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:13.536376    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:13.536403    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:13.536406    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:13.538305    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:14.035960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:14.035973    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:14.035979    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:14.035983    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:14.037566    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:14.535889    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:14.535909    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:14.535920    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:14.535926    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:14.538796    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:14.538873    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:15.037263    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:15.037278    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:15.037284    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:15.037291    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:15.038934    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:15.535930    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:15.535949    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:15.535957    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:15.535961    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:15.538412    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:16.035774    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:16.035790    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:16.035798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:16.035803    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:16.037617    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:16.536338    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:16.536352    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:16.536359    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:16.536362    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:16.538545    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:17.036602    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:17.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:17.036625    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:17.036630    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:17.039042    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:17.039098    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:17.535886    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:17.535901    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:17.535907    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:17.535910    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:17.538060    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:18.036894    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:18.036938    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:18.036947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:18.036950    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:18.038702    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:18.535556    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:18.535571    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:18.535580    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:18.535586    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:18.537620    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:19.035993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:19.036009    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:19.036017    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:19.036021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:19.038160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:19.536410    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:19.536433    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:19.536444    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:19.536452    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:19.539613    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:19.539694    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:20.035430    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:20.035445    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:20.035456    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:20.035466    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:20.037008    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:20.536812    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:20.536836    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:20.536849    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:20.536855    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:20.539846    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:21.035730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:21.035746    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:21.035755    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:21.035761    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:21.037893    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:21.536119    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:21.536158    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:21.536173    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:21.536181    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:21.538305    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:22.035742    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:22.035779    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:22.035790    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:22.035796    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:22.038072    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:22.038175    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:22.536977    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:22.536992    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:22.536999    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:22.537002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:22.539319    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:23.036522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:23.036538    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:23.036544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:23.036547    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:23.038326    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:23.537176    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:23.537194    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:23.537202    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:23.537208    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:23.539537    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:24.036672    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:24.036686    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:24.036692    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:24.036696    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:24.038290    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:24.038347    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:24.536490    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:24.536508    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:24.536519    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:24.536525    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:24.539462    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:25.036309    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:25.036323    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:25.036329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:25.036332    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:25.038173    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:25.535523    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:25.535539    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:25.535547    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:25.535552    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:25.538454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:26.035663    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:26.035681    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:26.035719    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:26.035722    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:26.037593    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:26.536821    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:26.536884    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:26.536893    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:26.536896    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:26.538841    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:26.538912    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:27.036722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:27.036734    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:27.036740    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:27.036743    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:27.038648    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:27.537059    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:27.537079    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:27.537111    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:27.537116    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:27.539595    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:28.035398    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:28.035411    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:28.035417    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:28.035421    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:28.037116    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:28.536047    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:28.536115    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:28.536125    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:28.536133    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:28.538589    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:29.036033    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:29.036048    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:29.036055    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:29.036058    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:29.038794    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:29.038860    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:29.536173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:29.536187    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:29.536193    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:29.536198    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:29.538161    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:30.036950    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:30.037050    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:30.037065    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:30.037072    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:30.039996    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:30.536407    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:30.536424    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:30.536485    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:30.536492    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:30.538637    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:31.036484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:31.036581    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:31.036593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:31.036600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:31.039439    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:31.039521    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:31.535848    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:31.535863    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:31.535872    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:31.535878    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:31.538048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:32.036070    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:32.036083    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:32.036092    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:32.036097    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:32.038358    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:32.535559    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:32.535583    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:32.535597    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:32.535604    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:32.538962    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:33.035868    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:33.035880    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:33.035887    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:33.035890    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:33.038234    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:33.536345    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:33.536363    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:33.536408    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:33.536413    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:33.538408    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:33.538470    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:34.035876    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:34.035899    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:34.035911    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:34.035917    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:34.038813    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:34.535532    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:34.535555    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:34.535599    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:34.535611    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:34.538619    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.036525    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:35.036545    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:35.036557    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:35.036565    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:35.039453    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.536317    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:35.536338    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:35.536346    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:35.536351    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:35.538546    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.538604    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:36.035614    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:36.035632    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:36.035642    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:36.035648    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:36.037951    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:36.535593    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:36.535610    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:36.535620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:36.535627    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:36.538091    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:37.035952    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:37.035972    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:37.035984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:37.035992    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:37.039078    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:37.536397    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:37.536416    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:37.536425    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:37.536431    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:37.538652    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:37.538721    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:38.036647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:38.036688    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:38.036697    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:38.036702    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:38.038657    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:38.535391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:38.535458    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:38.535469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:38.535474    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:38.537747    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:39.036877    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:39.036896    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:39.036908    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:39.036916    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:39.039937    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:39.537361    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:39.537463    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:39.537475    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:39.537480    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:39.540492    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:39.540575    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:40.035736    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:40.035759    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:40.035797    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:40.035817    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:40.038896    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:40.536124    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:40.536136    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:40.536142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:40.536147    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:40.538082    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:41.036456    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:41.036502    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:41.036513    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:41.036519    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:41.038631    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:41.535516    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:41.535529    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:41.535535    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:41.535539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:41.537637    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:42.035758    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:42.035779    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:42.035790    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:42.035795    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:42.038565    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:42.038648    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:42.536775    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:42.536801    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:42.536856    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:42.536867    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:42.539883    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:43.036733    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:43.036747    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:43.036754    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:43.036758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:43.038792    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:43.536704    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:43.536719    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:43.536725    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:43.536730    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:43.538830    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:44.037317    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:44.037342    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:44.037351    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:44.037356    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:44.040355    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:44.040430    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:44.537337    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:44.537352    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:44.537358    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:44.537362    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:44.539426    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:45.036153    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:45.036174    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:45.036187    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:45.036193    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:45.039178    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:45.535572    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:45.535584    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:45.535591    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:45.535596    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:45.537420    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:46.037146    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:46.037161    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:46.037168    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:46.037199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:46.039539    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:46.536761    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:46.536842    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:46.536857    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:46.536863    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:46.539600    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:46.539683    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:47.037209    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:47.037228    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:47.037237    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:47.037243    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:47.039381    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:47.536097    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:47.536127    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:47.536138    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:47.536143    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:47.540045    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:48.035580    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:48.035598    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:48.035610    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:48.035618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:48.037609    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:48.535945    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:48.535960    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:48.535966    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:48.535969    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:48.537852    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:49.036904    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:49.036928    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:49.036941    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:49.036946    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:49.039794    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:49.039868    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:49.536635    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:49.536649    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:49.536699    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:49.536704    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:49.538637    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:50.035478    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:50.035491    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:50.035497    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:50.035500    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:50.037398    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:50.536222    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:50.536321    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:50.536335    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:50.536342    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:50.539228    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.035730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:51.035748    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:51.035813    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:51.035820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:51.037953    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.536457    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:51.536471    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:51.536480    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:51.536485    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:51.538865    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.538935    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:52.036481    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:52.036503    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:52.036583    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:52.036593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:52.039545    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:52.536583    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:52.536620    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:52.536636    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:52.536646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:52.539115    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:53.037214    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:53.037226    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:53.037256    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:53.037262    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:53.039257    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:53.535880    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:53.535892    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:53.535898    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:53.535901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:53.538097    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:54.035680    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:54.035691    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:54.035697    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:54.035702    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:54.037758    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:54.037819    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:54.536181    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:54.536195    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:54.536250    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:54.536256    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:54.538069    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:55.036750    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:55.036858    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:55.036874    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:55.036881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:55.040140    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:55.535731    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:55.535746    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:55.535752    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:55.535755    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:55.537710    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:56.037367    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:56.037382    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:56.037392    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:56.037396    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:56.039716    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:56.039828    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:56.535738    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:56.535750    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:56.535757    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:56.535760    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:56.537553    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:57.036797    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:57.036852    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:57.036859    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:57.036862    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:57.038921    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:57.535419    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:57.535437    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:57.535446    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:57.535452    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:57.537842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.035459    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:58.035475    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:58.035484    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:58.035488    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:58.037963    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.536607    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:58.536625    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:58.536640    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:58.536653    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:58.539173    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.539233    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:59.035868    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:59.035890    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:59.035902    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:59.035912    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:59.038872    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:59.535411    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:59.535426    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:59.535432    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:59.535434    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:59.537913    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:00.036663    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:00.036679    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:00.036686    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:00.036690    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:00.038915    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:00.536586    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:00.536602    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:00.536610    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:00.536615    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:00.538823    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:01.037017    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:01.037041    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:01.037053    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:01.037058    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:01.039885    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:01.039956    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:01.537010    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:01.537022    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:01.537028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:01.537032    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:01.538870    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:02.036801    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:02.036819    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:02.036827    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:02.036831    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:02.039277    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:02.535479    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:02.535495    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:02.535501    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:02.535505    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:02.537168    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:03.037023    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:03.037069    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:03.037079    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:03.037084    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:03.039521    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:03.536060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:03.536073    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:03.536079    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:03.536083    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:03.538949    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:03.539021    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:04.036364    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:04.036379    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:04.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:04.036390    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:04.038419    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:04.536237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:04.536251    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:04.536260    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:04.536264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:04.538409    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:05.035688    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:05.035701    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:05.035708    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:05.035712    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:05.037474    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:05.535639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:05.535661    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:05.535671    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:05.535676    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:05.538235    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:06.036540    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:06.036554    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:06.036560    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:06.036564    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:06.039139    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:06.039201    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:06.536852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:06.536867    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:06.536875    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:06.536879    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:06.539160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:07.037400    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:07.037412    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:07.037419    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:07.037422    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:07.039316    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:07.535475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:07.535496    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:07.535507    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:07.535514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:07.538665    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:11:08.035588    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:08.035602    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:08.035609    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:08.035614    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:08.037450    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:08.535606    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:08.535617    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:08.535624    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:08.535628    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:08.537643    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:08.537700    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:09.036533    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:09.036549    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:09.036556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:09.036560    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:09.038511    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:09.536726    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:09.536794    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:09.536805    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:09.536810    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:09.539347    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.036599    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:10.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:10.036626    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:10.036630    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:10.038891    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.535919    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:10.535991    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:10.536003    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:10.536009    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:10.538198    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.538256    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:11.035775    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:11.035789    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:11.035795    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:11.035799    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:11.037602    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:11.535963    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:11.535977    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:11.535984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:11.535988    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:11.538020    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:12.035422    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:12.035494    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:12.035509    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:12.035514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:12.037902    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:12.536484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:12.536500    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:12.536506    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:12.536510    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:12.538333    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:12.538392    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:12.538407    3827 node_ready.go:38] duration metric: took 4m0.003142979s for node "ha-393000-m04" to be "Ready" ...
	I0731 10:11:12.560167    3827 out.go:177] 
	W0731 10:11:12.580908    3827 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0731 10:11:12.580926    3827 out.go:239] * 
	W0731 10:11:12.582125    3827 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:11:12.680641    3827 out.go:177] 
	
	
	==> Docker <==
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.914226423Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928630776Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928700349Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928854780Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.929029367Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.930900389Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.930985608Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.931085246Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.931220258Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928429805Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.933866106Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.933878374Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.934267390Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953115079Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953269656Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953688559Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953968281Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:38 ha-393000 dockerd[1174]: time="2024-07-31T17:06:38.259320248Z" level=info msg="ignoring event" container=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259503796Z" level=info msg="shim disconnected" id=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 namespace=moby
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259817357Z" level=warning msg="cleaning up after shim disconnected" id=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 namespace=moby
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259827803Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937784723Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937892479Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937935988Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.938076078Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	21ff27483d07f       6e38f40d628db                                                                                         4 minutes ago       Running             storage-provisioner       2                   31c959cec2158       storage-provisioner
	7500c837dfe73       8c811b4aec35f                                                                                         5 minutes ago       Running             busybox                   1                   f5579bdb56284       busybox-fc5497c4f-b94zr
	492e11c732d18       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   1                   22a2f7cb99560       coredns-7db6d8ff4d-wvqjl
	26d835568c733       cbb01a7bd410d                                                                                         5 minutes ago       Running             coredns                   1                   8336e3fbaa274       coredns-7db6d8ff4d-5m8st
	193af4895baf9       6f1d07c71fa0f                                                                                         5 minutes ago       Running             kindnet-cni               1                   304fa6a12c82b       kindnet-hjm7c
	4f56054bbee16       55bb025d2cfa5                                                                                         5 minutes ago       Running             kube-proxy                1                   7e638ed37b5ca       kube-proxy-zc52f
	c2de84de71d0d       6e38f40d628db                                                                                         5 minutes ago       Exited              storage-provisioner       1                   31c959cec2158       storage-provisioner
	42b34888f43b4       76932a3b37d7e                                                                                         5 minutes ago       Running             kube-controller-manager   6                   dd7a38b9a9134       kube-controller-manager-ha-393000
	bf0af6a864492       38af8ddebf499                                                                                         5 minutes ago       Running             kube-vip                  1                   7ae512ce66d9e       kube-vip-ha-393000
	0a6a6d756b8d8       76932a3b37d7e                                                                                         5 minutes ago       Exited              kube-controller-manager   5                   dd7a38b9a9134       kube-controller-manager-ha-393000
	a34d35a3b612b       3edc18e7b7672                                                                                         5 minutes ago       Running             kube-scheduler            2                   b550834f339ce       kube-scheduler-ha-393000
	488f4fddc126e       3861cfcd7c04c                                                                                         5 minutes ago       Running             etcd                      2                   35bc88d55a5f9       etcd-ha-393000
	7e0d32286913b       1f6d574d502f3                                                                                         5 minutes ago       Running             kube-apiserver            5                   913ebe1d27d36       kube-apiserver-ha-393000
	aec44315311a1       1f6d574d502f3                                                                                         7 minutes ago       Exited              kube-apiserver            4                   194073f1c5ac9       kube-apiserver-ha-393000
	86018b08bbaa1       3861cfcd7c04c                                                                                         10 minutes ago      Exited              etcd                      1                   ba75e4f4299bf       etcd-ha-393000
	5fcb6f7d8ab78       38af8ddebf499                                                                                         10 minutes ago      Exited              kube-vip                  0                   e6198932cc027       kube-vip-ha-393000
	d088fefe5f8e3       3edc18e7b7672                                                                                         10 minutes ago      Exited              kube-scheduler            1                   f04a7ecd568d2       kube-scheduler-ha-393000
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   14 minutes ago      Exited              busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         16 minutes ago      Exited              coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         16 minutes ago      Exited              coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              17 minutes ago      Exited              kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         17 minutes ago      Exited              kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	
	
	==> coredns [26d835568c73] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45868 - 37816 "HINFO IN 2903702352377705943.3393804209116430399. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.009308312s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[336879232]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.192) (total time: 30001ms):
	Trace[336879232]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.193)
	Trace[336879232]: [30.001669762s] [30.001669762s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[792684680]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.192) (total time: 30002ms):
	Trace[792684680]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.193)
	Trace[792684680]: [30.002844954s] [30.002844954s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[252017809]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.190) (total time: 30004ms):
	Trace[252017809]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.192)
	Trace[252017809]: [30.004125023s] [30.004125023s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [492e11c732d1] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:50203 - 38178 "HINFO IN 6515882504773672893.3508195612419770899. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.008964582s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1731745039]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.202) (total time: 30000ms):
	Trace[1731745039]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.203)
	Trace[1731745039]: [30.000463s] [30.000463s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1820975691]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.203) (total time: 30000ms):
	Trace[1820975691]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.203)
	Trace[1820975691]: [30.00019609s] [30.00019609s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[58591392]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.202) (total time: 30001ms):
	Trace[58591392]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.203)
	Trace[58591392]: [30.001286385s] [30.001286385s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [feda36fb8a03] <==
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	Name:               ha-393000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:53:48 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:11:17 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 17:06:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-393000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 9e10f5eb61854acbaf6547934383ee12
	  System UUID:                2cfe48dd-0000-0000-9b98-537ad9823a95
	  Boot ID:                    b9343713-c701-4963-b11c-cdefca0b39ab
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-b94zr              0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 coredns-7db6d8ff4d-5m8st             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     17m
	  kube-system                 coredns-7db6d8ff4d-wvqjl             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     17m
	  kube-system                 etcd-ha-393000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         17m
	  kube-system                 kindnet-hjm7c                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      17m
	  kube-system                 kube-apiserver-ha-393000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-controller-manager-ha-393000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-proxy-zc52f                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-scheduler-ha-393000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-vip-ha-393000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m13s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         17m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 17m                    kube-proxy       
	  Normal  Starting                 5m11s                  kube-proxy       
	  Normal  Starting                 17m                    kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  17m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  17m                    kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    17m                    kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     17m                    kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           17m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  NodeReady                16m                    kubelet          Node ha-393000 status is now: NodeReady
	  Normal  RegisteredNode           15m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           14m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           12m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  NodeHasSufficientMemory  5m57s (x8 over 5m57s)  kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  Starting                 5m57s                  kubelet          Starting kubelet.
	  Normal  NodeHasNoDiskPressure    5m57s (x8 over 5m57s)  kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     5m57s (x7 over 5m57s)  kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  5m57s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           5m15s                  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           5m7s                   node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           4m38s                  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	
	
	Name:               ha-393000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:55:06 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:11:15 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:27 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-393000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 83c6a2bd65fe41eb8d2ed449f1d84121
	  System UUID:                7863443c-0000-0000-8e8d-bbd47bc06547
	  Boot ID:                    aad47d4e-f7f0-4bd8-87b6-edfb69496407
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-zln22                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 etcd-ha-393000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         16m
	  kube-system                 kindnet-lcwbs                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      16m
	  kube-system                 kube-apiserver-ha-393000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-controller-manager-ha-393000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-proxy-cf577                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-scheduler-ha-393000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-vip-ha-393000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 5m28s                  kube-proxy       
	  Normal   Starting                 12m                    kube-proxy       
	  Normal   Starting                 16m                    kube-proxy       
	  Normal   NodeAllocatableEnforced  16m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  16m (x8 over 16m)      kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    16m (x8 over 16m)      kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     16m (x7 over 16m)      kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           16m                    node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           15m                    node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           14m                    node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   Starting                 12m                    kubelet          Starting kubelet.
	  Warning  Rebooted                 12m                    kubelet          Node ha-393000-m02 has been rebooted, boot id: febe9487-cc37-4f76-a943-4c3bd5898a28
	  Normal   NodeHasSufficientPID     12m                    kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  12m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  12m                    kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    12m                    kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   RegisteredNode           12m                    node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   Starting                 5m38s                  kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  5m38s (x8 over 5m38s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    5m38s (x8 over 5m38s)  kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     5m38s (x7 over 5m38s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  5m38s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           5m15s                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           5m7s                   node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           4m38s                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	
	
	Name:               ha-393000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:56:18 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:11:11 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:06:25 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:06:25 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:06:25 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:06:25 +0000   Wed, 31 Jul 2024 16:56:40 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-393000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 6bd67d455470412d948a97ba6f8b8a9a
	  System UUID:                451d42a6-0000-0000-8ccb-b8851dda0594
	  Boot ID:                    0d534f8f-f62b-4786-808f-39cb1c1bf961
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-n8d7h                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 etcd-ha-393000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         14m
	  kube-system                 kindnet-s2pv6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      15m
	  kube-system                 kube-apiserver-ha-393000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 kube-controller-manager-ha-393000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-proxy-cr9pg                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 kube-scheduler-ha-393000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 kube-vip-ha-393000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 4m50s              kube-proxy       
	  Normal   Starting                 14m                kube-proxy       
	  Normal   NodeAllocatableEnforced  15m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  15m (x8 over 15m)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    15m (x8 over 15m)  kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     15m (x7 over 15m)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           15m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           14m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           14m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           12m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           5m15s              node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           5m7s               node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   Starting                 4m54s              kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  4m54s              kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  4m54s              kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    4m54s              kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     4m54s              kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 4m54s              kubelet          Node ha-393000-m03 has been rebooted, boot id: 0d534f8f-f62b-4786-808f-39cb1c1bf961
	  Normal   RegisteredNode           4m38s              node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	
	
	==> dmesg <==
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.035849] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.008140] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[  +5.683009] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.007123] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.689234] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.257015] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000008] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +2.569890] systemd-fstab-generator[473]: Ignoring "noauto" option for root device
	[  +0.101117] systemd-fstab-generator[485]: Ignoring "noauto" option for root device
	[  +1.260537] kauditd_printk_skb: 42 callbacks suppressed
	[  +0.721842] systemd-fstab-generator[1103]: Ignoring "noauto" option for root device
	[  +0.244917] systemd-fstab-generator[1140]: Ignoring "noauto" option for root device
	[  +0.105223] systemd-fstab-generator[1152]: Ignoring "noauto" option for root device
	[  +0.108861] systemd-fstab-generator[1166]: Ignoring "noauto" option for root device
	[  +2.483787] systemd-fstab-generator[1382]: Ignoring "noauto" option for root device
	[  +0.096628] systemd-fstab-generator[1394]: Ignoring "noauto" option for root device
	[  +0.110449] systemd-fstab-generator[1406]: Ignoring "noauto" option for root device
	[  +0.128159] systemd-fstab-generator[1422]: Ignoring "noauto" option for root device
	[  +0.446597] systemd-fstab-generator[1585]: Ignoring "noauto" option for root device
	[  +6.854766] kauditd_printk_skb: 271 callbacks suppressed
	[ +21.847998] kauditd_printk_skb: 40 callbacks suppressed
	[Jul31 17:06] kauditd_printk_skb: 80 callbacks suppressed
	
	
	==> etcd [488f4fddc126] <==
	{"level":"warn","ts":"2024-07-31T17:06:16.678285Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:16.778473Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:16.87759Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:16.978537Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.078586Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.089155Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.090449Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.157185Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.16138Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:17.177991Z","caller":"rafthttp/peer.go:267","msg":"dropped internal Raft message since sending buffer is full (overloaded network)","message-type":"MsgHeartbeat","local-member-id":"b8c6c7563d17d844","from":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152","remote-peer-name":"pipeline","remote-peer-active":false}
	{"level":"warn","ts":"2024-07-31T17:06:19.235777Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.7:2380/version","remote-member-id":"cc1c22e219d8e152","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:19.235865Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"cc1c22e219d8e152","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:20.081142Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:20.096152Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:23.237137Z","caller":"etcdserver/cluster_util.go:294","msg":"failed to reach the peer URL","address":"https://192.169.0.7:2380/version","remote-member-id":"cc1c22e219d8e152","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:23.237345Z","caller":"etcdserver/cluster_util.go:158","msg":"failed to get version","remote-member-id":"cc1c22e219d8e152","error":"Get \"https://192.169.0.7:2380/version\": dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:25.082226Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:06:25.096276Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: connection refused"}
	{"level":"info","ts":"2024-07-31T17:06:27.159074Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:06:27.159122Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:06:27.167117Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:06:27.326929Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"cc1c22e219d8e152","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-07-31T17:06:27.327046Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:06:27.348194Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"cc1c22e219d8e152","stream-type":"stream Message"}
	{"level":"info","ts":"2024-07-31T17:06:27.348297Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	
	
	==> etcd [86018b08bbaa] <==
	{"level":"info","ts":"2024-07-31T17:04:54.706821Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:54.70684Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:54.70685Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:04:55.363421Z","caller":"etcdserver/server.go:2089","msg":"failed to publish local member to cluster through raft","local-member-id":"b8c6c7563d17d844","local-member-attributes":"{Name:ha-393000 ClientURLs:[https://192.169.0.5:2379]}","request-path":"/0/members/b8c6c7563d17d844/attributes","publish-timeout":"7s","error":"etcdserver: request timed out"}
	{"level":"warn","ts":"2024-07-31T17:04:55.539618Z","caller":"etcdhttp/health.go:232","msg":"serving /health false; no leader"}
	{"level":"warn","ts":"2024-07-31T17:04:55.539664Z","caller":"etcdhttp/health.go:119","msg":"/health error","output":"{\"health\":\"false\",\"reason\":\"RAFT NO LEADER\"}","status-code":503}
	{"level":"info","ts":"2024-07-31T17:04:56.510556Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.510829Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.511027Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.51112Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.511212Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306509Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306743Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306923Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.307075Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.307212Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404702Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404767Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404769Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:04:59.405991Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"info","ts":"2024-07-31T17:05:00.106932Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106958Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106967Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106977Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106982Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	
	
	==> kernel <==
	 17:11:20 up 6 min,  0 users,  load average: 0.32, 0.28, 0.12
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [193af4895baf] <==
	I0731 17:10:39.076498       1 main.go:299] handling current node
	I0731 17:10:49.072088       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:10:49.072129       1 main.go:299] handling current node
	I0731 17:10:49.072141       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:10:49.072146       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:10:49.072341       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:10:49.072369       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:10:59.068725       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:10:59.069056       1 main.go:299] handling current node
	I0731 17:10:59.069318       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:10:59.069379       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:10:59.069623       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:10:59.069809       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:11:09.067717       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:11:09.067835       1 main.go:299] handling current node
	I0731 17:11:09.067855       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:11:09.067865       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:11:09.068308       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:11:09.068348       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:11:19.076310       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:11:19.076570       1 main.go:299] handling current node
	I0731 17:11:19.076705       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:11:19.076850       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:11:19.077151       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:11:19.077244       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:59:40.110698       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:59:50.118349       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:59:50.118427       1 main.go:299] handling current node
	I0731 16:59:50.118450       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:59:50.118464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:59:50.118651       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:59:50.118739       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.118883       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:00.118987       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:00.119126       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:00.119236       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.119356       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:00.119483       1 main.go:299] handling current node
	I0731 17:00:10.110002       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:10.111054       1 main.go:299] handling current node
	I0731 17:00:10.111286       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:10.111319       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:10.111445       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:10.111480       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:20.116250       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:20.116442       1 main.go:299] handling current node
	I0731 17:00:20.116458       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:20.116464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:20.116608       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:20.116672       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [7e0d32286913] <==
	I0731 17:05:50.070570       1 controller.go:80] Starting OpenAPI V3 AggregationController
	I0731 17:05:50.074783       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0731 17:05:50.074947       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:05:50.086677       1 crdregistration_controller.go:111] Starting crd-autoregister controller
	I0731 17:05:50.086708       1 shared_informer.go:313] Waiting for caches to sync for crd-autoregister
	I0731 17:05:50.117864       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0731 17:05:50.122120       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:05:50.122365       1 policy_source.go:224] refreshing policies
	I0731 17:05:50.132563       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0731 17:05:50.166384       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0731 17:05:50.168074       1 shared_informer.go:320] Caches are synced for configmaps
	I0731 17:05:50.168116       1 apf_controller.go:379] Running API Priority and Fairness config worker
	I0731 17:05:50.168122       1 apf_controller.go:382] Running API Priority and Fairness periodic rebalancing process
	I0731 17:05:50.170411       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0731 17:05:50.174248       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0731 17:05:50.178334       1 handler_discovery.go:447] Starting ResourceDiscoveryManager
	I0731 17:05:50.187980       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0731 17:05:50.188024       1 aggregator.go:165] initial CRD sync complete...
	I0731 17:05:50.188030       1 autoregister_controller.go:141] Starting autoregister controller
	I0731 17:05:50.188034       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0731 17:05:50.188038       1 cache.go:39] Caches are synced for autoregister controller
	E0731 17:05:50.205462       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I0731 17:05:51.075340       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0731 17:06:47.219071       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0731 17:07:08.422863       1 controller.go:615] quota admission added evaluator for: endpoints
	
	
	==> kube-apiserver [aec44315311a] <==
	I0731 17:03:27.253147       1 options.go:221] external host was not specified, using 192.169.0.5
	I0731 17:03:27.253888       1 server.go:148] Version: v1.30.3
	I0731 17:03:27.253988       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:03:27.786353       1 shared_informer.go:313] Waiting for caches to sync for node_authorizer
	I0731 17:03:27.788898       1 shared_informer.go:313] Waiting for caches to sync for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:03:27.790619       1 plugins.go:157] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0731 17:03:27.790629       1 plugins.go:160] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0731 17:03:27.790778       1 instance.go:299] Using reconciler: lease
	W0731 17:03:47.786207       1 logging.go:59] [core] [Channel #1 SubChannel #3] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0731 17:03:47.786314       1 logging.go:59] [core] [Channel #2 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	F0731 17:03:47.791937       1 instance.go:292] Error creating leases: error creating storage factory: context deadline exceeded
	
	
	==> kube-controller-manager [0a6a6d756b8d] <==
	I0731 17:05:30.561595       1 serving.go:380] Generated self-signed cert in-memory
	I0731 17:05:31.250391       1 controllermanager.go:189] "Starting" version="v1.30.3"
	I0731 17:05:31.250471       1 controllermanager.go:191] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:05:31.252077       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0731 17:05:31.252281       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0731 17:05:31.252444       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:05:31.254793       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	E0731 17:05:51.257636       1 controllermanager.go:234] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: an error on the server (\"[+]ping ok\\n[+]log ok\\n[+]etcd ok\\n[+]poststarthook/start-apiserver-admission-initializer ok\\n[+]poststarthook/generic-apiserver-start-informers ok\\n[+]poststarthook/priority-and-fairness-config-consumer ok\\n[+]poststarthook/priority-and-fairness-filter ok\\n[+]poststarthook/storage-object-count-tracker-hook ok\\n[+]poststarthook/start-apiextensions-informers ok\\n[+]poststarthook/start-apiextensions-controllers ok\\n[+]poststarthook/crd-informer-synced ok\\n[+]poststarthook/start-service-ip-repair-controllers ok\\n[-]poststarthook/rbac/bootstrap-roles failed: reason withheld\\n[+]poststarthook/scheduling/bootstrap-system-priority-classes ok\\n[+]poststarthook/priority-and-fairness-config-producer ok\\n[+]poststarthook/start-system-namespaces-controller
ok\\n[+]poststarthook/bootstrap-controller ok\\n[+]poststarthook/start-cluster-authentication-info-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok\\n[+]poststarthook/start-legacy-token-tracking-controller ok\\n[+]poststarthook/aggregator-reload-proxy-client-cert ok\\n[+]poststarthook/start-kube-aggregator-informers ok\\n[+]poststarthook/apiservice-registration-controller ok\\n[+]poststarthook/apiservice-status-available-controller ok\\n[+]poststarthook/apiservice-discovery-controller ok\\n[+]poststarthook/kube-apiserver-autoregistration ok\\n[+]autoregister-completion ok\\n[+]poststarthook/apiservice-openapi-controller ok\\n[+]poststarthook/apiservice-openapiv3-controller ok\\nhealthz check failed\") has prevented the request from succeeding"
	
	
	==> kube-controller-manager [42b34888f43b] <==
	I0731 17:06:12.895845       1 shared_informer.go:320] Caches are synced for ReplicationController
	I0731 17:06:12.903208       1 shared_informer.go:320] Caches are synced for GC
	I0731 17:06:12.903274       1 shared_informer.go:320] Caches are synced for taint-eviction-controller
	I0731 17:06:12.920443       1 shared_informer.go:320] Caches are synced for endpoint_slice
	I0731 17:06:12.952902       1 shared_informer.go:320] Caches are synced for endpoint_slice_mirroring
	I0731 17:06:12.964558       1 shared_informer.go:320] Caches are synced for resource quota
	I0731 17:06:13.012295       1 shared_informer.go:320] Caches are synced for resource quota
	I0731 17:06:13.022225       1 shared_informer.go:320] Caches are synced for ClusterRoleAggregator
	I0731 17:06:13.501091       1 shared_informer.go:320] Caches are synced for garbage collector
	I0731 17:06:13.558892       1 shared_informer.go:320] Caches are synced for garbage collector
	I0731 17:06:13.559095       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0731 17:06:26.973668       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="55.100255ms"
	I0731 17:06:26.975840       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="54.971µs"
	I0731 17:06:29.221856       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="14.898144ms"
	I0731 17:06:29.222046       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="66.05µs"
	I0731 17:06:47.214265       1 endpointslice_controller.go:311] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mvtct\": the object has been modified; please apply your changes to the latest version and try again"
	I0731 17:06:47.214807       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"a0a921f4-5219-42ca-94c6-a4038d9ff710", APIVersion:"v1", ResourceVersion:"259", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mvtct": the object has been modified; please apply your changes to the latest version and try again
	I0731 17:06:47.241205       1 endpointslice_controller.go:311] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mvtct\": the object has been modified; please apply your changes to the latest version and try again"
	I0731 17:06:47.241526       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="78.18352ms"
	I0731 17:06:47.241539       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"a0a921f4-5219-42ca-94c6-a4038d9ff710", APIVersion:"v1", ResourceVersion:"259", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mvtct": the object has been modified; please apply your changes to the latest version and try again
	E0731 17:06:47.241671       1 replica_set.go:557] sync "kube-system/coredns-7db6d8ff4d" failed with Operation cannot be fulfilled on replicasets.apps "coredns-7db6d8ff4d": the object has been modified; please apply your changes to the latest version and try again
	I0731 17:06:47.242012       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="316.596µs"
	I0731 17:06:47.246958       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="100.8µs"
	I0731 17:06:47.288893       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="32.237881ms"
	I0731 17:06:47.289070       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.102µs"
	
	
	==> kube-proxy [4f56054bbee1] <==
	I0731 17:06:08.426782       1 server_linux.go:69] "Using iptables proxy"
	I0731 17:06:08.446564       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 17:06:08.497695       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 17:06:08.497829       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 17:06:08.497985       1 server_linux.go:165] "Using iptables Proxier"
	I0731 17:06:08.502095       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 17:06:08.503040       1 server.go:872] "Version info" version="v1.30.3"
	I0731 17:06:08.503116       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:06:08.506909       1 config.go:192] "Starting service config controller"
	I0731 17:06:08.507443       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 17:06:08.507578       1 config.go:319] "Starting node config controller"
	I0731 17:06:08.507600       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 17:06:08.509126       1 config.go:101] "Starting endpoint slice config controller"
	I0731 17:06:08.509154       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 17:06:08.607797       1 shared_informer.go:320] Caches are synced for node config
	I0731 17:06:08.607880       1 shared_informer.go:320] Caches are synced for service config
	I0731 17:06:08.610417       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [a34d35a3b612] <==
	I0731 17:05:30.706492       1 serving.go:380] Generated self-signed cert in-memory
	W0731 17:05:41.405023       1 authentication.go:368] Error looking up in-cluster authentication configuration: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication": net/http: TLS handshake timeout
	W0731 17:05:41.405046       1 authentication.go:369] Continuing without authentication configuration. This may treat all requests as anonymous.
	W0731 17:05:41.405050       1 authentication.go:370] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I0731 17:05:50.110697       1 server.go:154] "Starting Kubernetes Scheduler" version="v1.30.3"
	I0731 17:05:50.110745       1 server.go:156] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:05:50.118585       1 secure_serving.go:213] Serving securely on 127.0.0.1:10259
	I0731 17:05:50.120054       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0731 17:05:50.120091       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0731 17:05:50.120106       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0731 17:05:50.221789       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kube-scheduler [d088fefe5f8e] <==
	E0731 17:04:26.658553       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:28.887716       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:28.887806       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:32.427417       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:32.427586       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:36.436787       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:36.436870       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:40.022061       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://192.169.0.5:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:40.022227       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://192.169.0.5:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:43.471012       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:43.471291       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:43.930296       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:43.930321       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:44.041999       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: Get "https://192.169.0.5:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:44.042358       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.169.0.5:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:48.230649       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:48.230983       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:58.373439       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:58.373554       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:05:00.249019       1 server.go:214] "waiting for handlers to sync" err="context canceled"
	I0731 17:05:00.249450       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	I0731 17:05:00.249577       1 secure_serving.go:258] Stopped listening on 127.0.0.1:10259
	E0731 17:05:00.249641       1 shared_informer.go:316] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0731 17:05:00.249670       1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	E0731 17:05:00.249984       1 run.go:74] "command failed" err="finished without leader elect"
	
	
	==> kubelet <==
	Jul 31 17:06:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:06:38 ha-393000 kubelet[1592]: I0731 17:06:38.567240    1592 scope.go:117] "RemoveContainer" containerID="6d966e37d361871f946979a92770e4f4459ed0d5ff621124310f7ec91474bd95"
	Jul 31 17:06:38 ha-393000 kubelet[1592]: I0731 17:06:38.567467    1592 scope.go:117] "RemoveContainer" containerID="c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381"
	Jul 31 17:06:38 ha-393000 kubelet[1592]: E0731 17:06:38.567576    1592 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(a59b97ca-f030-4c73-b4db-00b444d39095)\"" pod="kube-system/storage-provisioner" podUID="a59b97ca-f030-4c73-b4db-00b444d39095"
	Jul 31 17:06:50 ha-393000 kubelet[1592]: I0731 17:06:50.883078    1592 scope.go:117] "RemoveContainer" containerID="c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381"
	Jul 31 17:07:22 ha-393000 kubelet[1592]: E0731 17:07:22.903115    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:07:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:07:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:07:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:07:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:08:22 ha-393000 kubelet[1592]: E0731 17:08:22.903462    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:08:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:09:22 ha-393000 kubelet[1592]: E0731 17:09:22.903125    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:09:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:10:22 ha-393000 kubelet[1592]: E0731 17:10:22.903858    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:10:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:261: (dbg) Run:  kubectl --context ha-393000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:285: <<< TestMultiControlPlane/serial/DegradedAfterClusterRestart FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/DegradedAfterClusterRestart (4.93s)

                                                
                                    
TestMultiControlPlane/serial/AddSecondaryNode (81.72s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/AddSecondaryNode
ha_test.go:605: (dbg) Run:  out/minikube-darwin-amd64 node add -p ha-393000 --control-plane -v=7 --alsologtostderr
E0731 10:11:54.504563    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
ha_test.go:605: (dbg) Done: out/minikube-darwin-amd64 node add -p ha-393000 --control-plane -v=7 --alsologtostderr: (1m16.665020561s)
ha_test.go:611: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
ha_test.go:611: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr: exit status 2 (575.367774ms)

                                                
                                                
-- stdout --
	ha-393000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-393000-m04
	type: Worker
	host: Running
	kubelet: Stopped
	
	ha-393000-m05
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0731 10:12:39.239908    3985 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:12:39.240102    3985 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:12:39.240108    3985 out.go:304] Setting ErrFile to fd 2...
	I0731 10:12:39.240111    3985 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:12:39.240285    3985 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:12:39.240480    3985 out.go:298] Setting JSON to false
	I0731 10:12:39.240504    3985 mustload.go:65] Loading cluster: ha-393000
	I0731 10:12:39.240538    3985 notify.go:220] Checking for updates...
	I0731 10:12:39.240824    3985 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:12:39.240841    3985 status.go:255] checking status of ha-393000 ...
	I0731 10:12:39.241189    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.241238    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.250013    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52399
	I0731 10:12:39.250341    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.250774    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.250784    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.250960    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.251111    3985 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:12:39.251217    3985 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:12:39.251295    3985 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:12:39.252315    3985 status.go:330] ha-393000 host status = "Running" (err=<nil>)
	I0731 10:12:39.252336    3985 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:12:39.252581    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.252607    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.261101    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52401
	I0731 10:12:39.261418    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.261760    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.261779    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.268456    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.268620    3985 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:12:39.268713    3985 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:12:39.268951    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.268971    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.277518    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52403
	I0731 10:12:39.277851    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.278171    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.278181    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.278372    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.278484    3985 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:12:39.278632    3985 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:12:39.278652    3985 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:12:39.278740    3985 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:12:39.278822    3985 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:12:39.278924    3985 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:12:39.279000    3985 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:12:39.318081    3985 ssh_runner.go:195] Run: systemctl --version
	I0731 10:12:39.322636    3985 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:12:39.335259    3985 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 10:12:39.335286    3985 api_server.go:166] Checking apiserver status ...
	I0731 10:12:39.335329    3985 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:12:39.347787    3985 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1940/cgroup
	W0731 10:12:39.355956    3985 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1940/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:12:39.356007    3985 ssh_runner.go:195] Run: ls
	I0731 10:12:39.359306    3985 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 10:12:39.363856    3985 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 10:12:39.363873    3985 status.go:422] ha-393000 apiserver status = Running (err=<nil>)
	I0731 10:12:39.363884    3985 status.go:257] ha-393000 status: &{Name:ha-393000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:12:39.363908    3985 status.go:255] checking status of ha-393000-m02 ...
	I0731 10:12:39.364185    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.364208    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.373078    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52407
	I0731 10:12:39.373428    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.373783    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.373801    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.374022    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.374138    3985 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:12:39.374225    3985 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:12:39.374316    3985 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3849
	I0731 10:12:39.375355    3985 status.go:330] ha-393000-m02 host status = "Running" (err=<nil>)
	I0731 10:12:39.375366    3985 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 10:12:39.375618    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.375641    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.384336    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52409
	I0731 10:12:39.384680    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.385018    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.385037    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.385249    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.385349    3985 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:12:39.385435    3985 host.go:66] Checking if "ha-393000-m02" exists ...
	I0731 10:12:39.385686    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.385711    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.394396    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52411
	I0731 10:12:39.394750    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.395072    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.395081    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.395304    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.395424    3985 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:12:39.395566    3985 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:12:39.395578    3985 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:12:39.395664    3985 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:12:39.395756    3985 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:12:39.395856    3985 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:12:39.395963    3985 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:12:39.431828    3985 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:12:39.444429    3985 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 10:12:39.444447    3985 api_server.go:166] Checking apiserver status ...
	I0731 10:12:39.444494    3985 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:12:39.457025    3985 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2138/cgroup
	W0731 10:12:39.465910    3985 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2138/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:12:39.465971    3985 ssh_runner.go:195] Run: ls
	I0731 10:12:39.469204    3985 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 10:12:39.472976    3985 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 10:12:39.472991    3985 status.go:422] ha-393000-m02 apiserver status = Running (err=<nil>)
	I0731 10:12:39.473000    3985 status.go:257] ha-393000-m02 status: &{Name:ha-393000-m02 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:12:39.473011    3985 status.go:255] checking status of ha-393000-m03 ...
	I0731 10:12:39.473300    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.473321    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.482048    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52415
	I0731 10:12:39.482398    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.482750    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.482766    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.482974    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.483084    3985 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 10:12:39.483177    3985 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:12:39.483270    3985 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 3858
	I0731 10:12:39.484397    3985 status.go:330] ha-393000-m03 host status = "Running" (err=<nil>)
	I0731 10:12:39.484408    3985 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 10:12:39.484666    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.484695    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.493412    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52417
	I0731 10:12:39.493741    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.494121    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.494139    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.494356    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.494467    3985 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:12:39.494552    3985 host.go:66] Checking if "ha-393000-m03" exists ...
	I0731 10:12:39.494847    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.494877    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.503606    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52419
	I0731 10:12:39.503953    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.504345    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.504363    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.504591    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.504719    3985 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:12:39.504877    3985 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:12:39.504890    3985 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:12:39.504976    3985 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:12:39.505061    3985 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:12:39.505147    3985 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:12:39.505258    3985 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:12:39.535794    3985 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:12:39.546490    3985 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 10:12:39.546505    3985 api_server.go:166] Checking apiserver status ...
	I0731 10:12:39.546550    3985 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:12:39.557106    3985 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1963/cgroup
	W0731 10:12:39.566448    3985 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1963/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:12:39.566500    3985 ssh_runner.go:195] Run: ls
	I0731 10:12:39.569509    3985 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 10:12:39.572586    3985 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 10:12:39.572601    3985 status.go:422] ha-393000-m03 apiserver status = Running (err=<nil>)
	I0731 10:12:39.572609    3985 status.go:257] ha-393000-m03 status: &{Name:ha-393000-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:12:39.572621    3985 status.go:255] checking status of ha-393000-m04 ...
	I0731 10:12:39.572905    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.572932    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.581679    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52423
	I0731 10:12:39.582033    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.582357    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.582368    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.582576    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.582689    3985 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 10:12:39.582778    3985 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:12:39.582867    3985 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3870
	I0731 10:12:39.583876    3985 status.go:330] ha-393000-m04 host status = "Running" (err=<nil>)
	I0731 10:12:39.583887    3985 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 10:12:39.584142    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.584168    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.592867    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52425
	I0731 10:12:39.593222    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.593541    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.593554    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.593809    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.593928    3985 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:12:39.594018    3985 host.go:66] Checking if "ha-393000-m04" exists ...
	I0731 10:12:39.594302    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.594327    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.604189    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52427
	I0731 10:12:39.604534    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.604862    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.604870    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.605086    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.605211    3985 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:12:39.605347    3985 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:12:39.605359    3985 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:12:39.605428    3985 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:12:39.605500    3985 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:12:39.605569    3985 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:12:39.605644    3985 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:12:39.638174    3985 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:12:39.650902    3985 status.go:257] ha-393000-m04 status: &{Name:ha-393000-m04 Host:Running Kubelet:Stopped APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:12:39.650929    3985 status.go:255] checking status of ha-393000-m05 ...
	I0731 10:12:39.651229    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.651252    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.660038    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52430
	I0731 10:12:39.660393    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.660763    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.660781    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.660998    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.661132    3985 main.go:141] libmachine: (ha-393000-m05) Calling .GetState
	I0731 10:12:39.661223    3985 main.go:141] libmachine: (ha-393000-m05) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:12:39.661321    3985 main.go:141] libmachine: (ha-393000-m05) DBG | hyperkit pid from json: 3963
	I0731 10:12:39.662336    3985 status.go:330] ha-393000-m05 host status = "Running" (err=<nil>)
	I0731 10:12:39.662347    3985 host.go:66] Checking if "ha-393000-m05" exists ...
	I0731 10:12:39.662609    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.662635    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.671142    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52432
	I0731 10:12:39.671472    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.671836    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.671851    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.672068    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.672180    3985 main.go:141] libmachine: (ha-393000-m05) Calling .GetIP
	I0731 10:12:39.672274    3985 host.go:66] Checking if "ha-393000-m05" exists ...
	I0731 10:12:39.672531    3985 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:12:39.672556    3985 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:12:39.681065    3985 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52434
	I0731 10:12:39.681393    3985 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:12:39.681711    3985 main.go:141] libmachine: Using API Version  1
	I0731 10:12:39.681727    3985 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:12:39.681949    3985 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:12:39.682049    3985 main.go:141] libmachine: (ha-393000-m05) Calling .DriverName
	I0731 10:12:39.682202    3985 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:12:39.682214    3985 main.go:141] libmachine: (ha-393000-m05) Calling .GetSSHHostname
	I0731 10:12:39.682308    3985 main.go:141] libmachine: (ha-393000-m05) Calling .GetSSHPort
	I0731 10:12:39.682414    3985 main.go:141] libmachine: (ha-393000-m05) Calling .GetSSHKeyPath
	I0731 10:12:39.682499    3985 main.go:141] libmachine: (ha-393000-m05) Calling .GetSSHUsername
	I0731 10:12:39.682580    3985 sshutil.go:53] new ssh client: &{IP:192.169.0.9 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m05/id_rsa Username:docker}
	I0731 10:12:39.716195    3985 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:12:39.728740    3985 kubeconfig.go:125] found "ha-393000" server: "https://192.169.0.254:8443"
	I0731 10:12:39.728754    3985 api_server.go:166] Checking apiserver status ...
	I0731 10:12:39.728793    3985 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:12:39.741747    3985 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2005/cgroup
	W0731 10:12:39.750921    3985 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2005/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:12:39.750975    3985 ssh_runner.go:195] Run: ls
	I0731 10:12:39.755188    3985 api_server.go:253] Checking apiserver healthz at https://192.169.0.254:8443/healthz ...
	I0731 10:12:39.758268    3985 api_server.go:279] https://192.169.0.254:8443/healthz returned 200:
	ok
	I0731 10:12:39.758281    3985 status.go:422] ha-393000-m05 apiserver status = Running (err=<nil>)
	I0731 10:12:39.758297    3985 status.go:257] ha-393000-m05 status: &{Name:ha-393000-m05 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
ha_test.go:613: failed to run minikube status. args "out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr" : exit status 2
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:244: <<< TestMultiControlPlane/serial/AddSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/AddSecondaryNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (3.514113024s)
helpers_test.go:252: TestMultiControlPlane/serial/AddSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node stop m02 -v=7         | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:58 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node start m02 -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:59 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000 -v=7               | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-393000 -v=7                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT | 31 Jul 24 10:00 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	| node    | ha-393000 node delete m03 -v=7       | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | ha-393000 stop -v=7                  | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT | 31 Jul 24 10:05 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true             | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:05 PDT |                     |
	|         | -v=7 --alsologtostderr               |           |         |         |                     |                     |
	|         | --driver=hyperkit                    |           |         |         |                     |                     |
	| node    | add -p ha-393000                     | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:11 PDT | 31 Jul 24 10:12 PDT |
	|         | --control-plane -v=7                 |           |         |         |                     |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 10:05:02
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 10:05:02.368405    3827 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:05:02.368654    3827 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.368660    3827 out.go:304] Setting ErrFile to fd 2...
	I0731 10:05:02.368664    3827 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.368853    3827 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:05:02.370244    3827 out.go:298] Setting JSON to false
	I0731 10:05:02.392379    3827 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2072,"bootTime":1722443430,"procs":453,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:05:02.392490    3827 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:05:02.414739    3827 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 10:05:02.457388    3827 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:05:02.457417    3827 notify.go:220] Checking for updates...
	I0731 10:05:02.499271    3827 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:02.520330    3827 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:05:02.541352    3827 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:05:02.562183    3827 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:05:02.583467    3827 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:05:02.605150    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:02.605829    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.605892    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.615374    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51985
	I0731 10:05:02.615746    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.616162    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.616171    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.616434    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.616563    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.616815    3827 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:05:02.617053    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.617075    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.625506    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51987
	I0731 10:05:02.625873    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.626205    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.626218    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.626409    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.626526    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.655330    3827 out.go:177] * Using the hyperkit driver based on existing profile
	I0731 10:05:02.697472    3827 start.go:297] selected driver: hyperkit
	I0731 10:05:02.697517    3827 start.go:901] validating driver "hyperkit" against &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:02.697705    3827 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:05:02.697830    3827 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:05:02.698011    3827 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:05:02.707355    3827 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:05:02.711327    3827 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.711347    3827 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:05:02.714056    3827 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:05:02.714115    3827 cni.go:84] Creating CNI manager for ""
	I0731 10:05:02.714124    3827 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:05:02.714208    3827 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:02.714310    3827 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:05:02.756588    3827 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 10:05:02.778505    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:02.778576    3827 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:05:02.778606    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:05:02.778797    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:05:02.778816    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:05:02.779007    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:02.779936    3827 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:05:02.780056    3827 start.go:364] duration metric: took 96.562µs to acquireMachinesLock for "ha-393000"
	I0731 10:05:02.780090    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:05:02.780107    3827 fix.go:54] fixHost starting: 
	I0731 10:05:02.780518    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.780547    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.789537    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51989
	I0731 10:05:02.789941    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.790346    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.790360    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.790582    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.790683    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.790784    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:02.790882    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.790960    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:05:02.791917    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 3685 missing from process table
	I0731 10:05:02.791950    3827 fix.go:112] recreateIfNeeded on ha-393000: state=Stopped err=<nil>
	I0731 10:05:02.791969    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	W0731 10:05:02.792054    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:05:02.834448    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000" ...
	I0731 10:05:02.857592    3827 main.go:141] libmachine: (ha-393000) Calling .Start
	I0731 10:05:02.857865    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.857903    3827 main.go:141] libmachine: (ha-393000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 10:05:02.857999    3827 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 10:05:02.972788    3827 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 10:05:02.972822    3827 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:05:02.973002    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002e0840)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:02.973031    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002e0840)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:02.973095    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=s
erial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:05:02.973143    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset
norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:05:02.973162    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:05:02.974700    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Pid is 3840
	I0731 10:05:02.975089    3827 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 10:05:02.975104    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.975174    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:05:02.977183    3827 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 10:05:02.977235    3827 main.go:141] libmachine: (ha-393000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:05:02.977252    3827 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66aa6ebd}
	I0731 10:05:02.977264    3827 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 10:05:02.977271    3827 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 10:05:02.977358    3827 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 10:05:02.978043    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:02.978221    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:02.978639    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:05:02.978649    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.978783    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:02.978867    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:02.978959    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:02.979081    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:02.979169    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:02.979279    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:02.979484    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:02.979495    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:05:02.982358    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:05:03.035630    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:05:03.036351    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:03.036364    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:03.036371    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:03.036377    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:03.417037    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:05:03.417051    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:05:03.531673    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:03.531715    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:03.531732    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:03.531747    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:03.532606    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:05:03.532629    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:05:09.110387    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:05:09.110442    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:05:09.110451    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:05:09.135557    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:05:12.964386    3827 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.5:22: connect: connection refused
	I0731 10:05:16.034604    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:05:16.034620    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.034750    3827 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 10:05:16.034759    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.034882    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.034984    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.035084    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.035183    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.035281    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.035421    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.035570    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.035579    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 10:05:16.113215    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 10:05:16.113236    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.113381    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.113518    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.113636    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.113755    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.113885    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.114075    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.114086    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:05:16.184090    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:05:16.184121    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:05:16.184150    3827 buildroot.go:174] setting up certificates
	I0731 10:05:16.184163    3827 provision.go:84] configureAuth start
	I0731 10:05:16.184170    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.184309    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:16.184430    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.184520    3827 provision.go:143] copyHostCerts
	I0731 10:05:16.184558    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:16.184631    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:05:16.184638    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:16.184770    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:05:16.184969    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:16.185016    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:05:16.185020    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:16.185099    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:05:16.185248    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:16.185290    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:05:16.185295    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:16.185376    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:05:16.185533    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 10:05:16.315363    3827 provision.go:177] copyRemoteCerts
	I0731 10:05:16.315421    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:05:16.315435    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.315558    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.315655    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.315746    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.315837    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:16.355172    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:05:16.355248    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:05:16.374013    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:05:16.374082    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0731 10:05:16.392556    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:05:16.392614    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:05:16.411702    3827 provision.go:87] duration metric: took 227.524882ms to configureAuth
	I0731 10:05:16.411715    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:05:16.411879    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:16.411893    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:16.412059    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.412155    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.412231    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.412316    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.412388    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.412496    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.412621    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.412628    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:05:16.477022    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:05:16.477033    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:05:16.477102    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:05:16.477118    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.477251    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.477356    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.477432    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.477517    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.477641    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.477778    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.477823    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:05:16.554633    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:05:16.554652    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.554788    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.554883    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.554976    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.555060    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.555183    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.555333    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.555346    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:05:18.220571    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:05:18.220585    3827 machine.go:97] duration metric: took 15.241941013s to provisionDockerMachine
	I0731 10:05:18.220598    3827 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 10:05:18.220606    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:05:18.220616    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.220842    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:05:18.220863    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.220962    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.221049    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.221130    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.221229    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.266644    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:05:18.270380    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:05:18.270395    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:05:18.270494    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:05:18.270687    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:05:18.270693    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:05:18.270912    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:05:18.279363    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:18.313374    3827 start.go:296] duration metric: took 92.765768ms for postStartSetup
	I0731 10:05:18.313403    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.313592    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:05:18.313611    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.313704    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.313791    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.313881    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.313968    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.352727    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:05:18.352783    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:05:18.406781    3827 fix.go:56] duration metric: took 15.626681307s for fixHost
	I0731 10:05:18.406809    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.406951    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.407051    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.407152    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.407242    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.407364    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:18.407503    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:18.407510    3827 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 10:05:18.475125    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445518.591979627
	
	I0731 10:05:18.475138    3827 fix.go:216] guest clock: 1722445518.591979627
	I0731 10:05:18.475144    3827 fix.go:229] Guest: 2024-07-31 10:05:18.591979627 -0700 PDT Remote: 2024-07-31 10:05:18.406799 -0700 PDT m=+16.073052664 (delta=185.180627ms)
	I0731 10:05:18.475163    3827 fix.go:200] guest clock delta is within tolerance: 185.180627ms
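The clock check above samples the guest clock over SSH (`date +%s.%N`), compares it with the host clock, and accepts the host if the delta is within tolerance. A sketch of the delta arithmetic, hard-coding the two timestamps from the log so the computation itself is visible:

```shell
#!/bin/sh
# Sketch of the guest/host clock-skew check logged above.
# Values are the ones from the log; the guest side is normally
# sampled live over SSH with: date +%s.%N
guest=1722445518.591979627
host=1722445518.406799
# absolute delta in whole milliseconds, via awk (no bc dependency)
delta_ms=$(awk -v g="$guest" -v h="$host" \
  'BEGIN { d = (g - h) * 1000; if (d < 0) d = -d; printf "%.0f", d }')
echo "delta=${delta_ms}ms"
[ "$delta_ms" -lt 2000 ] && echo "within tolerance"
```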
	I0731 10:05:18.475167    3827 start.go:83] releasing machines lock for "ha-393000", held for 15.69510158s
	I0731 10:05:18.475186    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.475358    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:18.475493    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.475894    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.476002    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.476070    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:05:18.476101    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.476134    3827 ssh_runner.go:195] Run: cat /version.json
	I0731 10:05:18.476146    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.476186    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.476210    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.476297    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.476335    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.476385    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.476425    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.476484    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.476507    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.560719    3827 ssh_runner.go:195] Run: systemctl --version
	I0731 10:05:18.565831    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 10:05:18.570081    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:05:18.570125    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:05:18.582480    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
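The find command above renames any bridge/podman CNI config that is not already disabled, so the container runtime ignores it. A sketch against a temp directory (the real target is `/etc/cni/net.d`); GNU find's `-printf` from the log is dropped here for portability:

```shell
#!/bin/sh
# Sketch of the CNI-disable step above: bridge/podman configs get a
# .mk_disabled suffix; anything else (e.g. loopback) is left alone.
d=$(mktemp -d)
touch "$d/87-podman-bridge.conflist" "$d/99-loopback.conf"
find "$d" -maxdepth 1 -type f \
  \( \( -name '*bridge*' -o -name '*podman*' \) -a -not -name '*.mk_disabled' \) \
  -exec sh -c 'mv "$1" "$1.mk_disabled"' _ {} \;
ls "$d"
```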
	I0731 10:05:18.582493    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:18.582597    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:18.598651    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:05:18.607729    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:05:18.616451    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:05:18.616493    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:05:18.625351    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:18.634238    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:05:18.643004    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:18.651930    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:05:18.660791    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:05:18.669545    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:05:18.678319    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:05:18.687162    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:05:18.695297    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:05:18.703279    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:18.796523    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
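The sequence above rewrites containerd's `config.toml` in place with `sed` to force the `cgroupfs` cgroup driver (`SystemdCgroup = false`) before restarting the service. A sketch of the key edit, applied to a temp copy rather than `/etc/containerd/config.toml` (the `-i -r` flags assume GNU sed, as on the Linux guest):

```shell
#!/bin/sh
# Sketch of the cgroup-driver rewrite above: flip SystemdCgroup to false
# while preserving the line's indentation via the \1 back-reference.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
grep SystemdCgroup "$cfg"
```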
	I0731 10:05:18.814363    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:18.814439    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:05:18.827366    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:18.839312    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:05:18.855005    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:18.866218    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:18.877621    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:05:18.902460    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:18.913828    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:18.928675    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:05:18.931574    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:05:18.939501    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:05:18.952896    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:05:19.047239    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:05:19.144409    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:05:19.144484    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:05:19.159518    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:19.256187    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:05:21.607075    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.350869373s)
	I0731 10:05:21.607140    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:05:21.618076    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:05:21.632059    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:21.642878    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:05:21.739846    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:05:21.840486    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:21.956403    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:05:21.971397    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:21.982152    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:22.074600    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:05:22.139737    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:05:22.139811    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:05:22.144307    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:05:22.144354    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:05:22.147388    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:05:22.177098    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:05:22.177167    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:22.195025    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:22.255648    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:05:22.255698    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:22.256066    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:05:22.260342    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
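The hosts-file update above is idempotent: it strips any existing line ending in the tab-separated name, then appends the current mapping, so repeated runs never duplicate the entry. A sketch with a temp file standing in for `/etc/hosts` (IPs taken from the log; written in POSIX sh, so the tab is built with `printf` instead of bash's `$'\t'`):

```shell
#!/bin/sh
# Sketch of the idempotent hosts-entry update above: remove the stale
# mapping for host.minikube.internal, then append the fresh one.
hosts=$(mktemp)
tab=$(printf '\t')
printf '127.0.0.1\tlocalhost\n192.169.0.9\thost.minikube.internal\n' > "$hosts"
{ grep -v "${tab}host.minikube.internal\$" "$hosts"
  printf '192.169.0.1\thost.minikube.internal\n'; } > "$hosts.new"
mv "$hosts.new" "$hosts"
cat "$hosts"
```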
	I0731 10:05:22.270020    3827 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 10:05:22.270145    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:22.270198    3827 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:05:22.283427    3827 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:05:22.283451    3827 docker.go:615] Images already preloaded, skipping extraction
	I0731 10:05:22.283523    3827 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:05:22.296364    3827 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:05:22.296384    3827 cache_images.go:84] Images are preloaded, skipping loading
	I0731 10:05:22.296395    3827 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 10:05:22.296485    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:05:22.296554    3827 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 10:05:22.333611    3827 cni.go:84] Creating CNI manager for ""
	I0731 10:05:22.333625    3827 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:05:22.333642    3827 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 10:05:22.333657    3827 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 10:05:22.333735    3827 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 10:05:22.333754    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:05:22.333805    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:05:22.346453    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:05:22.346520    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:05:22.346575    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:05:22.354547    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:05:22.354585    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 10:05:22.361938    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 10:05:22.375252    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:05:22.388755    3827 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 10:05:22.402335    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:05:22.415747    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:05:22.418701    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:22.428772    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:22.517473    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:22.532209    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 10:05:22.532222    3827 certs.go:194] generating shared ca certs ...
	I0731 10:05:22.532233    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:22.532416    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:05:22.532495    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:05:22.532505    3827 certs.go:256] generating profile certs ...
	I0731 10:05:22.532617    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:05:22.532703    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e
	I0731 10:05:22.532784    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:05:22.532791    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:05:22.532813    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:05:22.532832    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:05:22.532850    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:05:22.532866    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:05:22.532896    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:05:22.532925    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:05:22.532949    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:05:22.533054    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:05:22.533101    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:05:22.533110    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:05:22.533142    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:05:22.533177    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:05:22.533206    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:05:22.533274    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:22.533306    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.533327    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.533344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.533765    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:05:22.562933    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:05:22.585645    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:05:22.608214    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:05:22.634417    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:05:22.664309    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:05:22.693214    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:05:22.749172    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:05:22.798119    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:05:22.837848    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:05:22.862351    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:05:22.887141    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 10:05:22.900789    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:05:22.904988    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:05:22.914154    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.917542    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.917577    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.921712    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:05:22.930986    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:05:22.940208    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.943536    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.943573    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.947845    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:05:22.957024    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:05:22.965988    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.969319    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.969351    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.973794    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:05:22.982944    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:05:22.986290    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:05:22.990544    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:05:22.994707    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:05:22.999035    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:05:23.003364    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:05:23.007486    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:05:23.011657    3827 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 C
lusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false fres
hpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:do
cker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:23.011769    3827 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 10:05:23.024287    3827 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 10:05:23.032627    3827 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0731 10:05:23.032639    3827 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0731 10:05:23.032681    3827 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0731 10:05:23.040731    3827 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:05:23.041056    3827 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-393000" does not appear in /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.041141    3827 kubeconfig.go:62] /Users/jenkins/minikube-integration/19349-1046/kubeconfig needs updating (will repair): [kubeconfig missing "ha-393000" cluster setting kubeconfig missing "ha-393000" context setting]
	I0731 10:05:23.041332    3827 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.041968    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.042168    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)},
UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 10:05:23.042482    3827 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 10:05:23.042638    3827 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0731 10:05:23.050561    3827 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0731 10:05:23.050575    3827 kubeadm.go:597] duration metric: took 17.931942ms to restartPrimaryControlPlane
	I0731 10:05:23.050580    3827 kubeadm.go:394] duration metric: took 38.928464ms to StartCluster
	I0731 10:05:23.050588    3827 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.050661    3827 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.051035    3827 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.051268    3827 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:05:23.051280    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:05:23.051290    3827 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 10:05:23.051393    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:23.095938    3827 out.go:177] * Enabled addons: 
	I0731 10:05:23.116914    3827 addons.go:510] duration metric: took 65.60253ms for enable addons: enabled=[]
	I0731 10:05:23.116954    3827 start.go:246] waiting for cluster config update ...
	I0731 10:05:23.116965    3827 start.go:255] writing updated cluster config ...
	I0731 10:05:23.138605    3827 out.go:177] 
	I0731 10:05:23.160466    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:23.160597    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.182983    3827 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 10:05:23.224869    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:23.224904    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:05:23.225104    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:05:23.225125    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:05:23.225250    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.226256    3827 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:05:23.226360    3827 start.go:364] duration metric: took 80.549µs to acquireMachinesLock for "ha-393000-m02"
	I0731 10:05:23.226385    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:05:23.226394    3827 fix.go:54] fixHost starting: m02
	I0731 10:05:23.226804    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:23.226838    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:23.236394    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52012
	I0731 10:05:23.236756    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:23.237106    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:23.237125    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:23.237342    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:23.237473    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:23.237574    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:05:23.237669    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.237738    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:05:23.238671    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:23.238732    3827 fix.go:112] recreateIfNeeded on ha-393000-m02: state=Stopped err=<nil>
	I0731 10:05:23.238750    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	W0731 10:05:23.238834    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:05:23.260015    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m02" ...
	I0731 10:05:23.302032    3827 main.go:141] libmachine: (ha-393000-m02) Calling .Start
	I0731 10:05:23.302368    3827 main.go:141] libmachine: (ha-393000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 10:05:23.302393    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.304220    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:23.304235    3827 main.go:141] libmachine: (ha-393000-m02) DBG | pid 3703 is in state "Stopped"
	I0731 10:05:23.304257    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid...
	I0731 10:05:23.304590    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 10:05:23.331752    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 10:05:23.331774    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:05:23.331901    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c2fc0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:23.331928    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c2fc0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:23.331992    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machine
s/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:05:23.332030    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=t
tyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:05:23.332051    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:05:23.333566    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Pid is 3849
	I0731 10:05:23.333951    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 10:05:23.333966    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.334032    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3849
	I0731 10:05:23.335680    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 10:05:23.335745    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:05:23.335779    3827 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:05:23.335790    3827 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbf52}
	I0731 10:05:23.335796    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 10:05:23.335803    3827 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 10:05:23.335842    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 10:05:23.336526    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:23.336703    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.337199    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:05:23.337210    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:23.337324    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:23.337431    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:23.337536    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:23.337643    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:23.337761    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:23.337898    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:23.338051    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:23.338058    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:05:23.341501    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:05:23.350236    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:05:23.351301    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:23.351321    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:23.351333    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:23.351364    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:23.736116    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:05:23.736132    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:05:23.851173    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:23.851191    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:23.851204    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:23.851217    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:23.852083    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:05:23.852399    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:05:29.408102    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:05:29.408171    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:05:29.408180    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:05:29.431671    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:05:34.400446    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:05:34.400461    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.400584    3827 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 10:05:34.400595    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.400705    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.400796    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.400890    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.400963    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.401039    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.401181    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.401327    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.401336    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 10:05:34.470038    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 10:05:34.470053    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.470199    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.470327    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.470407    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.470489    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.470615    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.470762    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.470773    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:05:34.535872    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:05:34.535890    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:05:34.535899    3827 buildroot.go:174] setting up certificates
	I0731 10:05:34.535905    3827 provision.go:84] configureAuth start
	I0731 10:05:34.535911    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.536042    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:34.536141    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.536239    3827 provision.go:143] copyHostCerts
	I0731 10:05:34.536274    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:34.536323    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:05:34.536328    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:34.536441    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:05:34.536669    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:34.536701    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:05:34.536706    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:34.536812    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:05:34.536958    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:34.536987    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:05:34.536992    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:34.537061    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:05:34.537222    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 10:05:34.648982    3827 provision.go:177] copyRemoteCerts
	I0731 10:05:34.649040    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:05:34.649057    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.649198    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.649295    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.649402    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.649489    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:34.683701    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:05:34.683772    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:05:34.703525    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:05:34.703596    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:05:34.722548    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:05:34.722624    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:05:34.742309    3827 provision.go:87] duration metric: took 206.391288ms to configureAuth
	I0731 10:05:34.742322    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:05:34.742483    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:34.742496    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:34.742630    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.742723    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.742814    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.742903    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.742982    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.743099    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.743260    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.743269    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:05:34.800092    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:05:34.800106    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:05:34.800191    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:05:34.800203    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.800330    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.800415    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.800506    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.800591    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.800702    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.800838    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.800885    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:05:34.869190    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:05:34.869210    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.869342    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.869439    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.869544    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.869626    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.869780    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.869920    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.869935    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:05:36.520454    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:05:36.520469    3827 machine.go:97] duration metric: took 13.183263325s to provisionDockerMachine
	I0731 10:05:36.520479    3827 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 10:05:36.520499    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:05:36.520508    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.520691    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:05:36.520702    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.520789    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.520884    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.520979    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.521066    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.561300    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:05:36.564926    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:05:36.564938    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:05:36.565027    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:05:36.565170    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:05:36.565176    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:05:36.565342    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:05:36.574123    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:36.603284    3827 start.go:296] duration metric: took 82.788869ms for postStartSetup
	I0731 10:05:36.603307    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.603494    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:05:36.603509    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.603613    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.603706    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.603803    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.603903    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.639240    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:05:36.639297    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:05:36.692559    3827 fix.go:56] duration metric: took 13.466165097s for fixHost
	I0731 10:05:36.692585    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.692728    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.692817    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.692901    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.692991    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.693111    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:36.693255    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:36.693263    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:05:36.752606    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445536.868457526
	
	I0731 10:05:36.752619    3827 fix.go:216] guest clock: 1722445536.868457526
	I0731 10:05:36.752626    3827 fix.go:229] Guest: 2024-07-31 10:05:36.868457526 -0700 PDT Remote: 2024-07-31 10:05:36.692574 -0700 PDT m=+34.358830009 (delta=175.883526ms)
	I0731 10:05:36.752636    3827 fix.go:200] guest clock delta is within tolerance: 175.883526ms
	I0731 10:05:36.752640    3827 start.go:83] releasing machines lock for "ha-393000-m02", held for 13.526270601s
	I0731 10:05:36.752657    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.752793    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:36.777379    3827 out.go:177] * Found network options:
	I0731 10:05:36.798039    3827 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 10:05:36.819503    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:05:36.819540    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820385    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820643    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820770    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:05:36.820818    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 10:05:36.820878    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:05:36.820996    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:05:36.821009    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.821024    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.821247    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.821250    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.821474    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.821525    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.821664    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.821739    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.821918    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 10:05:36.854335    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:05:36.854406    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:05:36.901302    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:05:36.901324    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:36.901422    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:36.917770    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:05:36.926621    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:05:36.935218    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:05:36.935259    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:05:36.943879    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:36.952873    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:05:36.961710    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:36.970281    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:05:36.979176    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:05:36.987922    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:05:36.996548    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:05:37.005349    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:05:37.013281    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:05:37.020977    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:37.118458    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:05:37.137862    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:37.137937    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:05:37.153588    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:37.167668    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:05:37.181903    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:37.192106    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:37.202268    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:05:37.223314    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:37.233629    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:37.248658    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:05:37.251547    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:05:37.258758    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:05:37.272146    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:05:37.371218    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:05:37.472623    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:05:37.472648    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:05:37.486639    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:37.587113    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:05:39.947283    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.360151257s)
	I0731 10:05:39.947347    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:05:39.958391    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:05:39.972060    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:39.983040    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:05:40.085475    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:05:40.202062    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:40.302654    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:05:40.316209    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:40.326252    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:40.418074    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:05:40.482758    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:05:40.482836    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:05:40.487561    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:05:40.487613    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:05:40.491035    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:05:40.518347    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:05:40.518420    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:40.537051    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:40.576384    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:05:40.597853    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:05:40.618716    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:40.618993    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:05:40.622501    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:40.631917    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:05:40.632085    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:40.632302    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:40.632324    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:40.640887    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52034
	I0731 10:05:40.641227    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:40.641546    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:40.641557    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:40.641784    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:40.641900    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:40.641993    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:40.642069    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:05:40.643035    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:05:40.643318    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:40.643340    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:40.651868    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52036
	I0731 10:05:40.652209    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:40.652562    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:40.652581    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:40.652781    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:40.652890    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:40.652982    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.6
	I0731 10:05:40.652988    3827 certs.go:194] generating shared ca certs ...
	I0731 10:05:40.653003    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:40.653135    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:05:40.653190    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:05:40.653199    3827 certs.go:256] generating profile certs ...
	I0731 10:05:40.653301    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:05:40.653388    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.59c17652
	I0731 10:05:40.653436    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:05:40.653443    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:05:40.653468    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:05:40.653489    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:05:40.653510    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:05:40.653529    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:05:40.653548    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:05:40.653566    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:05:40.653584    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:05:40.653667    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:05:40.653713    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:05:40.653722    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:05:40.653755    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:05:40.653790    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:05:40.653819    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:05:40.653897    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:40.653931    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:05:40.653957    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:40.653976    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:05:40.654001    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:40.654103    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:40.654205    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:40.654295    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:40.654382    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:40.686134    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0731 10:05:40.689771    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 10:05:40.697866    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0731 10:05:40.700957    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 10:05:40.708798    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 10:05:40.711973    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 10:05:40.719794    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0731 10:05:40.722937    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 10:05:40.731558    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0731 10:05:40.734708    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 10:05:40.742535    3827 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0731 10:05:40.745692    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 10:05:40.753969    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:05:40.774721    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:05:40.793621    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:05:40.813481    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:05:40.833191    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:05:40.853099    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:05:40.872942    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:05:40.892952    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:05:40.912690    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:05:40.932438    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:05:40.952459    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:05:40.971059    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 10:05:40.984708    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 10:05:40.998235    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 10:05:41.011745    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 10:05:41.025144    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 10:05:41.038794    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 10:05:41.052449    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 10:05:41.066415    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:05:41.070679    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:05:41.078894    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.082206    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.082237    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.086362    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:05:41.094634    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:05:41.103040    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.106511    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.106559    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.110939    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:05:41.119202    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:05:41.127421    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.130734    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.130783    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.134845    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:05:41.142958    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:05:41.146291    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:05:41.150662    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:05:41.154843    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:05:41.159061    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:05:41.163240    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:05:41.167541    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:05:41.171729    3827 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.30.3 docker true true} ...
	I0731 10:05:41.171784    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:05:41.171806    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:05:41.171838    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:05:41.184093    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:05:41.184125    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:05:41.184181    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:05:41.191780    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:05:41.191825    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 10:05:41.199155    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:05:41.212419    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:05:41.225964    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:05:41.239859    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:05:41.242661    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:41.251855    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:41.345266    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:41.360525    3827 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:05:41.360751    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:41.382214    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:05:41.402932    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:41.525126    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:41.539502    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:41.539699    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:05:41.539742    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:05:41.539934    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m02" to be "Ready" ...
	I0731 10:05:41.540009    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:41.540015    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:41.540022    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:41.540026    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.017427    3827 round_trippers.go:574] Response Status: 200 OK in 8477 milliseconds
	I0731 10:05:50.018648    3827 node_ready.go:49] node "ha-393000-m02" has status "Ready":"True"
	I0731 10:05:50.018662    3827 node_ready.go:38] duration metric: took 8.478709659s for node "ha-393000-m02" to be "Ready" ...
	I0731 10:05:50.018668    3827 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:05:50.018717    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:05:50.018723    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.018731    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.018737    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.028704    3827 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0731 10:05:50.043501    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.043562    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:05:50.043568    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.043574    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.043579    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.049258    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.050015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.050025    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.050031    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.050035    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.066794    3827 round_trippers.go:574] Response Status: 200 OK in 16 milliseconds
	I0731 10:05:50.067093    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.067103    3827 pod_ready.go:81] duration metric: took 23.584491ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.067110    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.067150    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 10:05:50.067155    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.067161    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.067170    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.072229    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.072653    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.072662    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.072674    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.072678    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.076158    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:50.076475    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.076487    3827 pod_ready.go:81] duration metric: took 9.372147ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.076494    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.076536    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 10:05:50.076541    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.076547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.076551    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.079467    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.079849    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.079858    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.079866    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.079871    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.086323    3827 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 10:05:50.086764    3827 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.086775    3827 pod_ready.go:81] duration metric: took 10.276448ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.086782    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.086839    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 10:05:50.086846    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.086852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.086861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.090747    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:50.091293    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:50.091301    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.091306    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.091310    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.093538    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.094155    3827 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.094165    3827 pod_ready.go:81] duration metric: took 7.376399ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.094171    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.094209    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 10:05:50.094214    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.094220    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.094223    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.096892    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.219826    3827 request.go:629] Waited for 122.388601ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:05:50.219867    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:05:50.219876    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.219882    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.219887    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.222303    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.222701    3827 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.222710    3827 pod_ready.go:81] duration metric: took 128.533092ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.222720    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.419341    3827 request.go:629] Waited for 196.517978ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:05:50.419372    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:05:50.419376    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.419382    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.419386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.424561    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.619242    3827 request.go:629] Waited for 194.143472ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.619333    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.619339    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.619346    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.619350    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.622245    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.622550    3827 pod_ready.go:97] node "ha-393000" hosting pod "kube-apiserver-ha-393000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-393000" has status "Ready":"False"
	I0731 10:05:50.622563    3827 pod_ready.go:81] duration metric: took 399.836525ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	E0731 10:05:50.622570    3827 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-393000" hosting pod "kube-apiserver-ha-393000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-393000" has status "Ready":"False"
	I0731 10:05:50.622575    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.819353    3827 request.go:629] Waited for 196.739442ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:50.819427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:50.819433    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.819438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.819447    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.822809    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:51.019387    3827 request.go:629] Waited for 196.0195ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.019427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.019480    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.019488    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.019494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.021643    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.220184    3827 request.go:629] Waited for 96.247837ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.220254    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.220260    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.220266    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.220271    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.222468    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.419702    3827 request.go:629] Waited for 196.732028ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.419735    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.419739    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.419746    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.419749    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.422018    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.622851    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.622865    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.622870    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.622873    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.625570    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.818923    3827 request.go:629] Waited for 192.647007ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.818965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.818971    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.818977    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.818981    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.821253    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.123108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:52.123124    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.123133    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.123137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.125336    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.220188    3827 request.go:629] Waited for 94.282602ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.220295    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.220306    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.220317    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.220325    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.223136    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.623123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:52.623202    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.623217    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.623227    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.626259    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:52.626893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.626903    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.626912    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.626916    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.628416    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:52.628799    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:53.124413    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:53.124432    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.124441    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.124446    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.127045    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:53.127494    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:53.127501    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.127511    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.127514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.129223    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:53.623065    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:53.623121    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.623133    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.623142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.626047    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:53.626707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:53.626717    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.626725    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.626729    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.628447    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:54.123646    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:54.123761    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.123778    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.123788    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.127286    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:54.128015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:54.128025    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.128033    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.128038    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.130101    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:54.623229    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:54.623244    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.623253    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.623266    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.625325    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:54.625780    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:54.625788    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.625794    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.625798    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.627218    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:55.123298    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:55.123318    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.123329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.123334    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.126495    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:55.127199    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:55.127207    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.127213    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.127217    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.128585    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:55.128968    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:55.623994    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:55.624008    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.624016    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.624021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.626813    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:55.627329    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:55.627336    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.627342    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.627345    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.628805    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:56.123118    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:56.123195    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.123210    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.123231    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.126276    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:56.126864    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:56.126872    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.126877    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.126881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.128479    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:56.623814    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:56.623924    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.623942    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.623953    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.626841    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:56.627450    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:56.627457    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.627463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.627467    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.628844    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:57.124173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:57.124250    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.124262    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.124287    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.127734    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:57.128370    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:57.128377    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.128383    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.128386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.130108    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:57.130481    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:57.624004    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:57.624033    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.624093    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.624103    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.627095    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:57.628522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:57.628533    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.628541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.628547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.630446    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.123493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:58.123505    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.123512    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.123514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.125506    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.126108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:58.126116    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.126121    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.126124    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.127991    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.623114    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:58.623141    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.623216    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.623228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.626428    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:58.627173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:58.627181    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.627187    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.627191    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.628749    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.123212    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:59.123231    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.123243    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.123249    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.126584    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:59.127100    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:59.127110    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.127118    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.127123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.129080    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.624707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:59.624736    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.624808    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.624814    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.627710    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:59.628543    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:59.628550    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.628556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.628560    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.630077    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.630437    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:00.123863    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:00.123878    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.123885    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.123888    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.125761    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.126237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:00.126245    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.126251    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.126254    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.127937    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.623226    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:00.623240    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.623246    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.623249    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.625210    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.625691    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:00.625699    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.625704    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.625708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.627280    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:01.124705    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:01.124804    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.124820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.124830    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.127445    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:01.127933    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:01.127941    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.127947    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.127950    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.129462    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:01.623718    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:01.623731    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.623736    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.623739    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.625948    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:01.626336    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:01.626344    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.626349    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.626352    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.627901    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.124021    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:02.124081    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.124088    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.124092    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.125801    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.126187    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:02.126195    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.126200    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.126204    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.127656    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.127974    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:02.623206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:02.623222    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.623228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.623232    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.626774    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:02.627381    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:02.627389    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.627395    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.627400    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.630037    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:03.122889    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:03.122980    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.122991    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.122997    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.125539    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:03.125964    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:03.125972    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.125976    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.125991    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.129847    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:03.623340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:03.623368    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.623379    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.623386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.626892    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:03.627517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:03.627524    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.627530    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.627532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.629281    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.123967    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:04.124007    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.124016    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.124021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.126604    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.127104    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.127111    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.127116    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.127131    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.128806    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.129260    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.129268    3827 pod_ready.go:81] duration metric: took 13.506690115s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.129277    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.129312    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:04.129317    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.129323    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.129328    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.131506    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.131966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.131974    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.131980    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.131984    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.133464    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.133963    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.133974    3827 pod_ready.go:81] duration metric: took 4.690553ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.133981    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.134013    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:04.134018    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.134023    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.134028    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.136093    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.136498    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:04.136506    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.136512    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.136515    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.138480    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.138864    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.138874    3827 pod_ready.go:81] duration metric: took 4.887644ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.138882    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.138917    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:04.138922    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.138928    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.138932    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.140760    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.141121    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.141129    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.141134    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.141137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.143127    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.143455    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.143464    3827 pod_ready.go:81] duration metric: took 4.577275ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.143471    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.143508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:04.143513    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.143519    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.143523    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.145638    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.145987    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.145994    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.146000    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.146003    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.147718    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.148046    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.148055    3827 pod_ready.go:81] duration metric: took 4.578507ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.148061    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.325414    3827 request.go:629] Waited for 177.298505ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:04.325532    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:04.325544    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.325555    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.325563    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.328825    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:04.525753    3827 request.go:629] Waited for 196.338568ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.525806    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.525817    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.525828    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.525836    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.529114    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:04.529604    3827 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.529616    3827 pod_ready.go:81] duration metric: took 381.550005ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.529625    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.724886    3827 request.go:629] Waited for 195.165832ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:04.724925    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:04.724931    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.724937    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.724942    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.726934    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.924942    3827 request.go:629] Waited for 197.623557ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.924972    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.924977    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.924984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.924987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.927056    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.927556    3827 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.927566    3827 pod_ready.go:81] duration metric: took 397.934888ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.927572    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.124719    3827 request.go:629] Waited for 197.081968ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:05.124759    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:05.124767    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.124774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.124777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.126705    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:05.324036    3827 request.go:629] Waited for 196.854241ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.324127    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.324136    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.324144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.324151    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.326450    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:05.326831    3827 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:05.326840    3827 pod_ready.go:81] duration metric: took 399.263993ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.326854    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.525444    3827 request.go:629] Waited for 198.543186ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:05.525479    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:05.525484    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.525490    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.525494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.527459    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:05.724382    3827 request.go:629] Waited for 196.465154ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.724493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.724505    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.724516    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.724528    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.727650    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:05.728134    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:05.728147    3827 pod_ready.go:81] duration metric: took 401.285988ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.728155    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.925067    3827 request.go:629] Waited for 196.808438ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:05.925117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:05.925127    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.925137    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.925147    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.928198    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.125772    3827 request.go:629] Waited for 196.79397ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:06.125895    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:06.125907    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.125918    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.125924    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.129114    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.129535    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:06.129548    3827 pod_ready.go:81] duration metric: took 401.386083ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.129557    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.324601    3827 request.go:629] Waited for 194.995432ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:06.324707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:06.324718    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.324729    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.324736    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.327699    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:06.524056    3827 request.go:629] Waited for 195.918056ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:06.524164    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:06.524175    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.524186    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.524192    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.527800    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.528245    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:06.528255    3827 pod_ready.go:81] duration metric: took 398.692914ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.528262    3827 pod_ready.go:38] duration metric: took 16.509588377s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:06.528282    3827 api_server.go:52] waiting for apiserver process to appear ...
	I0731 10:06:06.528341    3827 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:06:06.541572    3827 api_server.go:72] duration metric: took 25.181024878s to wait for apiserver process to appear ...
	I0731 10:06:06.541584    3827 api_server.go:88] waiting for apiserver healthz status ...
	I0731 10:06:06.541605    3827 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 10:06:06.544968    3827 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 10:06:06.545011    3827 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 10:06:06.545016    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.545023    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.545027    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.545730    3827 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 10:06:06.545799    3827 api_server.go:141] control plane version: v1.30.3
	I0731 10:06:06.545808    3827 api_server.go:131] duration metric: took 4.219553ms to wait for apiserver health ...
	I0731 10:06:06.545813    3827 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 10:06:06.724899    3827 request.go:629] Waited for 179.053526ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:06.724936    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:06.724942    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.724948    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.724951    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.733411    3827 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0731 10:06:06.742910    3827 system_pods.go:59] 24 kube-system pods found
	I0731 10:06:06.742937    3827 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:06.742945    3827 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:06.742950    3827 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:06.742953    3827 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:06.742958    3827 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:06.742961    3827 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:06.742963    3827 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:06.742966    3827 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:06.742968    3827 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:06.742971    3827 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:06.742973    3827 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:06.742977    3827 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:06.742981    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:06.742984    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:06.742986    3827 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:06.742989    3827 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:06.742991    3827 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:06.742995    3827 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:06.742998    3827 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:06.743001    3827 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:06.743003    3827 system_pods.go:61] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Pending
	I0731 10:06:06.743006    3827 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:06.743010    3827 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:06.743012    3827 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:06.743017    3827 system_pods.go:74] duration metric: took 197.200154ms to wait for pod list to return data ...
	I0731 10:06:06.743023    3827 default_sa.go:34] waiting for default service account to be created ...
	I0731 10:06:06.925020    3827 request.go:629] Waited for 181.949734ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:06.925060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:06.925067    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.925076    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.925081    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.927535    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:06.927730    3827 default_sa.go:45] found service account: "default"
	I0731 10:06:06.927740    3827 default_sa.go:55] duration metric: took 184.712762ms for default service account to be created ...
	I0731 10:06:06.927745    3827 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 10:06:07.125051    3827 request.go:629] Waited for 197.272072ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:07.125082    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:07.125090    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:07.125100    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:07.125104    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:07.129975    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:07.134630    3827 system_pods.go:86] 24 kube-system pods found
	I0731 10:06:07.134648    3827 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:07.134654    3827 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:07.134659    3827 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:07.134663    3827 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:07.134666    3827 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:07.134671    3827 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running / Ready:ContainersNotReady (containers with unready status: [kindnet-cni]) / ContainersReady:ContainersNotReady (containers with unready status: [kindnet-cni])
	I0731 10:06:07.134675    3827 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:07.134679    3827 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:07.134683    3827 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:07.134705    3827 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:07.134712    3827 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:07.134718    3827 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0731 10:06:07.134723    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:07.134728    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:07.134731    3827 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:07.134735    3827 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:07.134739    3827 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running / Ready:ContainersNotReady (containers with unready status: [kube-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-proxy])
	I0731 10:06:07.134743    3827 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:07.134747    3827 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:07.134751    3827 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:07.134755    3827 system_pods.go:89] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:07.134764    3827 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:07.134768    3827 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:07.134772    3827 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0731 10:06:07.134781    3827 system_pods.go:126] duration metric: took 207.030567ms to wait for k8s-apps to be running ...
	I0731 10:06:07.134786    3827 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 10:06:07.134841    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:06:07.148198    3827 system_svc.go:56] duration metric: took 13.406485ms WaitForService to wait for kubelet
	I0731 10:06:07.148215    3827 kubeadm.go:582] duration metric: took 25.78766951s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:06:07.148230    3827 node_conditions.go:102] verifying NodePressure condition ...
	I0731 10:06:07.324197    3827 request.go:629] Waited for 175.905806ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:07.324227    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:07.324232    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:07.324238    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:07.324243    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:07.329946    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:07.330815    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330830    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330840    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330843    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330847    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330850    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330853    3827 node_conditions.go:105] duration metric: took 182.619551ms to run NodePressure ...
	I0731 10:06:07.330860    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:06:07.330878    3827 start.go:255] writing updated cluster config ...
	I0731 10:06:07.352309    3827 out.go:177] 
	I0731 10:06:07.373528    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:07.373631    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.433500    3827 out.go:177] * Starting "ha-393000-m03" control-plane node in "ha-393000" cluster
	I0731 10:06:07.475236    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:06:07.475262    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:06:07.475398    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:06:07.475412    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:06:07.475498    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.476024    3827 start.go:360] acquireMachinesLock for ha-393000-m03: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:06:07.476077    3827 start.go:364] duration metric: took 40.57µs to acquireMachinesLock for "ha-393000-m03"
	I0731 10:06:07.476090    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:06:07.476095    3827 fix.go:54] fixHost starting: m03
	I0731 10:06:07.476337    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:07.476357    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:07.485700    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52041
	I0731 10:06:07.486069    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:07.486427    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:07.486449    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:07.486677    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:07.486797    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:07.486888    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 10:06:07.486969    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.487057    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 10:06:07.488010    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:06:07.488031    3827 fix.go:112] recreateIfNeeded on ha-393000-m03: state=Stopped err=<nil>
	I0731 10:06:07.488039    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	W0731 10:06:07.488129    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:06:07.525270    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m03" ...
	I0731 10:06:07.583189    3827 main.go:141] libmachine: (ha-393000-m03) Calling .Start
	I0731 10:06:07.583357    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.583398    3827 main.go:141] libmachine: (ha-393000-m03) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid
	I0731 10:06:07.584444    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:06:07.584457    3827 main.go:141] libmachine: (ha-393000-m03) DBG | pid 2994 is in state "Stopped"
	I0731 10:06:07.584473    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid...
	I0731 10:06:07.584622    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Using UUID 451d6bef-97a7-42a6-8ccb-b8851dda0594
	I0731 10:06:07.614491    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Generated MAC 3e:56:a2:18:e2:4c
	I0731 10:06:07.614519    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:06:07.614662    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003b11a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:07.614709    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003b11a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:07.614792    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "451d6bef-97a7-42a6-8ccb-b8851dda0594", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:06:07.614841    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 451d6bef-97a7-42a6-8ccb-b8851dda0594 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:06:07.614865    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:06:07.616508    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Pid is 3858
	I0731 10:06:07.617000    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 0
	I0731 10:06:07.617017    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.617185    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 3858
	I0731 10:06:07.619558    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 10:06:07.619621    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:06:07.619647    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:06:07.619664    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:06:07.619685    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:06:07.619703    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 10:06:07.619712    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Found match: 3e:56:a2:18:e2:4c
	I0731 10:06:07.619727    3827 main.go:141] libmachine: (ha-393000-m03) DBG | IP: 192.169.0.7
	I0731 10:06:07.619755    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 10:06:07.620809    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:07.621055    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.621590    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:06:07.621602    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:07.621745    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:07.621861    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:07.621957    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:07.622061    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:07.622150    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:07.622290    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:07.622460    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:07.622469    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:06:07.625744    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:06:07.635188    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:06:07.636453    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:07.636476    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:07.636488    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:07.636503    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:08.026194    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:06:08.026210    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:06:08.141380    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:08.141403    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:08.141420    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:08.141430    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:08.142228    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:06:08.142237    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:06:13.717443    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:06:13.717596    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:06:13.717612    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:06:13.741129    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:06:18.682578    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:06:18.682599    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.682767    3827 buildroot.go:166] provisioning hostname "ha-393000-m03"
	I0731 10:06:18.682779    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.682866    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.682981    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.683070    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.683166    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.683267    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.683412    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.683571    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.683581    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m03 && echo "ha-393000-m03" | sudo tee /etc/hostname
	I0731 10:06:18.749045    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m03
	
	I0731 10:06:18.749064    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.749190    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.749278    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.749369    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.749454    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.749565    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.749706    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.749722    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:06:18.806865    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:06:18.806883    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:06:18.806892    3827 buildroot.go:174] setting up certificates
	I0731 10:06:18.806898    3827 provision.go:84] configureAuth start
	I0731 10:06:18.806904    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.807035    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:18.807129    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.807209    3827 provision.go:143] copyHostCerts
	I0731 10:06:18.807236    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:06:18.807287    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:06:18.807293    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:06:18.807440    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:06:18.807654    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:06:18.807687    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:06:18.807691    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:06:18.807798    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:06:18.807946    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:06:18.807978    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:06:18.807983    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:06:18.808051    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:06:18.808199    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m03 san=[127.0.0.1 192.169.0.7 ha-393000-m03 localhost minikube]
	I0731 10:06:18.849388    3827 provision.go:177] copyRemoteCerts
	I0731 10:06:18.849440    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:06:18.849454    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.849608    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.849706    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.849793    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.849878    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:18.882927    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:06:18.883001    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:06:18.902836    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:06:18.902904    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:06:18.922711    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:06:18.922778    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 10:06:18.943709    3827 provision.go:87] duration metric: took 136.803232ms to configureAuth
	I0731 10:06:18.943724    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:06:18.943896    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:18.943910    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:18.944075    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.944168    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.944245    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.944342    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.944422    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.944538    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.944665    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.944672    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:06:18.996744    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:06:18.996756    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:06:18.996829    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:06:18.996840    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.996972    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.997082    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.997171    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.997252    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.997394    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.997538    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.997587    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:06:19.061774    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:06:19.061792    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:19.061924    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:19.062001    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:19.062094    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:19.062183    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:19.062322    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:19.062475    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:19.062487    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:06:20.667693    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:06:20.667709    3827 machine.go:97] duration metric: took 13.046112735s to provisionDockerMachine
	I0731 10:06:20.667718    3827 start.go:293] postStartSetup for "ha-393000-m03" (driver="hyperkit")
	I0731 10:06:20.667725    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:06:20.667738    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.667939    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:06:20.667954    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.668063    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.668167    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.668260    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.668365    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:20.711043    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:06:20.714520    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:06:20.714533    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:06:20.714632    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:06:20.714782    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:06:20.714789    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:06:20.714971    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:06:20.725237    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:06:20.756197    3827 start.go:296] duration metric: took 88.463878ms for postStartSetup
	I0731 10:06:20.756221    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.756402    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:06:20.756417    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.756509    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.756594    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.756688    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.756757    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:20.788829    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:06:20.788889    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:06:20.841715    3827 fix.go:56] duration metric: took 13.365618842s for fixHost
	I0731 10:06:20.841743    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.841878    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.841982    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.842069    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.842155    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.842314    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:20.842486    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:20.842494    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:06:20.895743    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445580.896263750
	
	I0731 10:06:20.895763    3827 fix.go:216] guest clock: 1722445580.896263750
	I0731 10:06:20.895768    3827 fix.go:229] Guest: 2024-07-31 10:06:20.89626375 -0700 PDT Remote: 2024-07-31 10:06:20.841731 -0700 PDT m=+78.507993684 (delta=54.53275ms)
	I0731 10:06:20.895779    3827 fix.go:200] guest clock delta is within tolerance: 54.53275ms
	I0731 10:06:20.895783    3827 start.go:83] releasing machines lock for "ha-393000-m03", held for 13.419701289s
	I0731 10:06:20.895800    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.895930    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:20.933794    3827 out.go:177] * Found network options:
	I0731 10:06:21.008361    3827 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0731 10:06:21.029193    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:06:21.029220    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:06:21.029239    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.029902    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.030149    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.030274    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:06:21.030303    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	W0731 10:06:21.030372    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:06:21.030402    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:06:21.030458    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:21.030487    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:06:21.030508    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:21.030615    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:21.030657    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:21.030724    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:21.030782    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:21.030837    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:21.030887    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:21.030941    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	W0731 10:06:21.060481    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:06:21.060548    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:06:21.113024    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:06:21.113039    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:06:21.113103    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:06:21.128523    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:06:21.136837    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:06:21.145325    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:06:21.145388    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:06:21.153686    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:06:21.162021    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:06:21.170104    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:06:21.178345    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:06:21.186720    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:06:21.195003    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:06:21.203212    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:06:21.211700    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:06:21.219303    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:06:21.226730    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:21.333036    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:06:21.355400    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:06:21.355468    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:06:21.370793    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:06:21.382599    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:06:21.397116    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:06:21.408366    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:06:21.419500    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:06:21.441593    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:06:21.453210    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:06:21.468638    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:06:21.471686    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:06:21.480107    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:06:21.493473    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:06:21.590098    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:06:21.695002    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:06:21.695025    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:06:21.709644    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:21.804799    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:06:24.090859    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.286034061s)
	I0731 10:06:24.090921    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:06:24.102085    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:06:24.115631    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:06:24.125950    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:06:24.222193    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:06:24.332843    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:24.449689    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:06:24.463232    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:06:24.474652    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:24.567486    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:06:24.631150    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:06:24.631230    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:06:24.635708    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:06:24.635764    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:06:24.638929    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:06:24.666470    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:06:24.666542    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:06:24.686587    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:06:24.729344    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:06:24.771251    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:06:24.792172    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 10:06:24.813314    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:24.813703    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:06:24.818215    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:06:24.828147    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:06:24.828324    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:24.828531    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:24.828552    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:24.837259    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52063
	I0731 10:06:24.837609    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:24.837954    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:24.837967    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:24.838165    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:24.838272    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:06:24.838349    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:24.838424    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:06:24.839404    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:06:24.839647    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:24.839672    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:24.848293    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52065
	I0731 10:06:24.848630    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:24.848982    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:24.848999    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:24.849191    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:24.849297    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:06:24.849393    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.7
	I0731 10:06:24.849399    3827 certs.go:194] generating shared ca certs ...
	I0731 10:06:24.849408    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:06:24.849551    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:06:24.849606    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:06:24.849615    3827 certs.go:256] generating profile certs ...
	I0731 10:06:24.849710    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:06:24.849799    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb
	I0731 10:06:24.849848    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:06:24.849860    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:06:24.849881    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:06:24.849901    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:06:24.849920    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:06:24.849937    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:06:24.849955    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:06:24.849974    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:06:24.849991    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:06:24.850072    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:06:24.850109    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:06:24.850118    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:06:24.850152    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:06:24.850184    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:06:24.850218    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:06:24.850285    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:06:24.850322    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:06:24.850344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:06:24.850366    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:24.850395    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:06:24.850485    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:06:24.850565    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:06:24.850653    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:06:24.850732    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:06:24.882529    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0731 10:06:24.886785    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 10:06:24.896598    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0731 10:06:24.900384    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 10:06:24.910269    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 10:06:24.914011    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 10:06:24.922532    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0731 10:06:24.925784    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 10:06:24.936850    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0731 10:06:24.940321    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 10:06:24.950026    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0731 10:06:24.953055    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 10:06:24.962295    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:06:24.982990    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:06:25.003016    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:06:25.022822    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:06:25.043864    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:06:25.064140    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:06:25.084546    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:06:25.105394    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:06:25.125890    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:06:25.146532    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:06:25.166742    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:06:25.186545    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 10:06:25.200206    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 10:06:25.214106    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 10:06:25.228037    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 10:06:25.242065    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 10:06:25.255847    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 10:06:25.269574    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 10:06:25.283881    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:06:25.288466    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:06:25.297630    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.301289    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.301331    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.305714    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:06:25.314348    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:06:25.322967    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.326578    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.326634    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.330926    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:06:25.339498    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:06:25.348151    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.351535    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.351576    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.355921    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:06:25.364535    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:06:25.368077    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:06:25.372428    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:06:25.376757    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:06:25.380980    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:06:25.385296    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:06:25.389606    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:06:25.393857    3827 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.30.3 docker true true} ...
	I0731 10:06:25.393914    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:06:25.393928    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:06:25.393959    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:06:25.405786    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:06:25.405830    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:06:25.405888    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:06:25.414334    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:06:25.414379    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 10:06:25.422310    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:06:25.435970    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:06:25.449652    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:06:25.463392    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:06:25.466266    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:06:25.476391    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:25.572265    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:06:25.587266    3827 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:06:25.587454    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:25.609105    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:06:25.650600    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:25.776520    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:06:25.790838    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:06:25.791048    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:06:25.791095    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:06:25.791257    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m03" to be "Ready" ...
	I0731 10:06:25.791299    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:25.791305    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.791311    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.791315    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.793351    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:25.793683    3827 node_ready.go:49] node "ha-393000-m03" has status "Ready":"True"
	I0731 10:06:25.793693    3827 node_ready.go:38] duration metric: took 2.426331ms for node "ha-393000-m03" to be "Ready" ...
	I0731 10:06:25.793700    3827 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:25.793737    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:25.793742    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.793753    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.793758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.797877    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:25.803934    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:25.803995    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:25.804000    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.804007    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.804011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.806477    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:25.806997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:25.807005    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.807011    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.807014    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.808989    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:26.304983    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:26.304998    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.305006    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.305010    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.307209    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:26.307839    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:26.307846    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.307852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.307861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.309644    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:26.805493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:26.805510    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.805520    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.805527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.821394    3827 round_trippers.go:574] Response Status: 200 OK in 15 milliseconds
	I0731 10:06:26.822205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:26.822215    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.822221    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.822224    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.827160    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:27.305824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:27.305839    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.305846    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.305848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.308258    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.308744    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:27.308752    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.308758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.308761    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.310974    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.805552    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:27.805567    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.805574    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.805578    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.807860    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.808403    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:27.808410    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.808416    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.808419    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.810436    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.810811    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:28.305577    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:28.305593    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.305600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.305604    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.311583    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:28.312446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:28.312455    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.312461    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.312465    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.314779    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:28.804391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:28.804407    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.804414    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.804420    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.806848    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:28.807227    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:28.807235    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.807241    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.807244    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.809171    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:29.305552    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:29.305615    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.305624    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.305629    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.308134    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.308891    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:29.308900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.308906    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.308909    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.311098    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.805109    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:29.805127    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.805192    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.805198    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.807898    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.808285    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:29.808292    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.808297    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.808300    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.810154    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:30.305017    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:30.305032    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.305045    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.305048    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.307205    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:30.307776    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:30.307783    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.307789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.307792    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.309771    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:30.310293    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:30.805366    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:30.805428    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.805436    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.805440    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.807864    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:30.808309    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:30.808316    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.808322    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.808325    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.810111    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:31.305667    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:31.305700    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.305708    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.305712    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.308126    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:31.308539    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:31.308546    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.308552    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.308556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.310279    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:31.804975    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:31.805002    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.805014    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.805020    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.808534    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:31.809053    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:31.809061    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.809066    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.809069    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.810955    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:32.304759    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:32.304815    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.304830    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.304839    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.308267    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:32.308684    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:32.308692    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.308698    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.308701    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.310475    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:32.310804    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:32.805138    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:32.805163    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.805175    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.805181    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.808419    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:32.809125    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:32.809133    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.809139    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.809143    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.810741    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:33.305088    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:33.305103    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.305109    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.305113    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.307495    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:33.307998    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:33.308005    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.308011    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.308015    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.309595    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:33.806000    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:33.806021    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.806049    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.806056    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.808625    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:33.809248    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:33.809259    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.809264    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.809269    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.810758    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:34.305752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:34.305832    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.305847    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.305853    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.308868    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:34.309591    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:34.309599    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.309605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.309608    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.311263    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:34.311627    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:34.804923    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:34.804948    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.804959    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.804965    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.808036    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:34.808636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:34.808646    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.808654    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.808670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.810398    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:35.305879    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:35.305966    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.305982    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.305991    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.309016    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:35.309584    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:35.309592    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.309598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.309601    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.311155    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:35.804092    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:35.804107    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.804114    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.804117    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.806476    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:35.806988    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:35.806997    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.807002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.807025    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.808897    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.305921    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:36.305943    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.305951    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.305955    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.308670    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:36.309170    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:36.309178    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.309184    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.309199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.310943    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.805015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:36.805085    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.805098    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.805106    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.808215    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:36.808810    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:36.808817    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.808823    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.808827    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.810482    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.810768    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:37.305031    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:37.305055    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.305068    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.305077    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.308209    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:37.308934    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:37.308942    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.308947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.308951    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.310514    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:37.805625    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:37.805671    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.805682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.805687    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.808188    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:37.808728    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:37.808735    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.808741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.808744    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.810288    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:38.305824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:38.305838    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.305845    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.305848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.307926    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:38.308378    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:38.308386    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.308391    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.308395    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.310092    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:38.805380    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:38.805397    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.805406    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.805410    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.807819    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:38.808368    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:38.808376    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.808382    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.808385    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.809904    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:39.305804    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:39.305820    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.305826    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.305830    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.307991    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:39.308527    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:39.308535    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.308541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.308546    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.310495    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:39.310929    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:39.806108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:39.806122    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.806129    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.806132    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.808192    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:39.808709    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:39.808718    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.808727    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.808730    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.810476    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:40.304101    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:40.304125    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.304137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.304144    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.307004    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:40.307629    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:40.307637    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.307643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.307646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.309373    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:40.804289    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:40.804302    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.804329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.804334    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.806678    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:40.807320    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:40.807328    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.807334    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.807338    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.809111    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:41.305710    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:41.305762    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.305770    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.305774    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.307795    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.308244    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:41.308252    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.308258    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.308261    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.310033    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:41.805219    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:41.805235    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.805242    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.805246    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.807574    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.808103    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:41.808112    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.808119    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.808123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.810305    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.810720    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:42.305509    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:42.305569    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.305580    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.305586    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.307774    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:42.308154    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:42.308161    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.308167    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.308170    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.309895    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:42.804631    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:42.804655    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.804667    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.804687    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.808080    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:42.808852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:42.808863    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.808869    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.808874    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.811059    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.304116    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:43.304217    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.304233    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.304239    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.306879    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.307340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:43.307348    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.307354    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.307358    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.308948    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:43.805920    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:43.805934    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.805981    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.805986    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.808009    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.808576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:43.808583    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.808589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.808592    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.810282    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:43.810804    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:44.304703    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:44.304728    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.304798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.304823    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.308376    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:44.308780    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:44.308787    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.308793    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.308797    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.310396    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:44.805218    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:44.805242    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.805255    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.805264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.808404    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:44.808967    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:44.808978    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.808986    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.808990    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.810748    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:45.304672    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:45.304770    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.304784    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.304791    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.307754    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:45.308249    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:45.308256    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.308261    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.308265    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.309903    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:45.804236    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:45.804265    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.804276    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.804281    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.807605    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:45.808214    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:45.808222    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.808228    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.808231    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.810076    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:46.305660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:46.305674    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.305723    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.305727    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.307959    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:46.308389    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:46.308397    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.308403    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.308406    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.310188    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:46.310668    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:46.805585    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:46.805685    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.805700    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.805708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.808399    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:46.808892    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:46.808900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.808910    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.808914    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.810397    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.304911    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:47.304926    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.304933    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.304936    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.307282    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.307761    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.307768    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.307774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.307777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.309541    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.309921    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.309931    3827 pod_ready.go:81] duration metric: took 21.505983976s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.309937    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.309966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 10:06:47.309971    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.309977    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.309980    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.311547    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.311995    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.312003    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.312009    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.312013    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.313414    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.313802    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.313811    3827 pod_ready.go:81] duration metric: took 3.869093ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.313818    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.313850    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 10:06:47.313855    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.313861    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.313865    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.315523    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.315938    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.315947    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.315955    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.315959    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.317522    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.317922    3827 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.317931    3827 pod_ready.go:81] duration metric: took 4.10711ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.317937    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.317971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 10:06:47.317976    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.317982    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.317985    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.319520    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.319893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:47.319900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.319906    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.319909    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.321439    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.321816    3827 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.321825    3827 pod_ready.go:81] duration metric: took 3.88293ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.321832    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.321862    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 10:06:47.321867    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.321872    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.321876    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.323407    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.323756    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:47.323763    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.323769    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.323773    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.325384    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.325703    3827 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.325712    3827 pod_ready.go:81] duration metric: took 3.875112ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.325727    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.505410    3827 request.go:629] Waited for 179.649549ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:06:47.505447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:06:47.505454    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.505462    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.505467    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.508003    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.705861    3827 request.go:629] Waited for 197.38651ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.705965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.705976    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.705987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.705997    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.708863    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.709477    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.709486    3827 pod_ready.go:81] duration metric: took 383.754198ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.709493    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.905743    3827 request.go:629] Waited for 196.205437ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:47.905783    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:47.905790    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.905812    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.905826    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.908144    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.106945    3827 request.go:629] Waited for 198.217758ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:48.106991    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:48.106998    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.107017    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.107023    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.109503    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.109889    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.109898    3827 pod_ready.go:81] duration metric: took 400.399458ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.109910    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.306502    3827 request.go:629] Waited for 196.553294ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:48.306576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:48.306583    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.306589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.306593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.308907    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.506077    3827 request.go:629] Waited for 196.82354ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:48.506171    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:48.506180    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.506189    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.506195    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.508341    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.508805    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.508814    3827 pod_ready.go:81] duration metric: took 398.898513ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.508829    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.706656    3827 request.go:629] Waited for 197.780207ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:48.706753    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:48.706765    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.706776    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.706784    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.709960    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:48.906621    3827 request.go:629] Waited for 195.987746ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:48.906714    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:48.906726    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.906737    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.906744    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.910100    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:48.910537    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.910550    3827 pod_ready.go:81] duration metric: took 401.715473ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.910559    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.106125    3827 request.go:629] Waited for 195.518023ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:49.106250    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:49.106262    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.106273    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.106280    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.109411    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:49.306599    3827 request.go:629] Waited for 196.360989ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:49.306720    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:49.306730    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.306741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.306747    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.309953    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:49.310311    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:49.310320    3827 pod_ready.go:81] duration metric: took 399.753992ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.310327    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.505092    3827 request.go:629] Waited for 194.718659ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:49.505129    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:49.505134    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.505140    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.505144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.510347    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:49.706499    3827 request.go:629] Waited for 195.722594ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:49.706547    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:49.706556    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.706623    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.706634    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.709639    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:49.710039    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:49.710049    3827 pod_ready.go:81] duration metric: took 399.716837ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.710061    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.906378    3827 request.go:629] Waited for 196.280735ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:49.906412    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:49.906418    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.906425    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.906442    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.911634    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:50.106586    3827 request.go:629] Waited for 194.536585ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:50.106637    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:50.106652    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.106717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.106725    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.109661    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:50.110176    3827 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.110189    3827 pod_ready.go:81] duration metric: took 400.121095ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.110197    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.306216    3827 request.go:629] Waited for 195.968962ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:50.306280    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:50.306286    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.306291    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.306301    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.308314    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:50.505180    3827 request.go:629] Waited for 196.336434ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:50.505320    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:50.505332    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.505344    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.505351    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.508601    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:50.509059    3827 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.509072    3827 pod_ready.go:81] duration metric: took 398.868353ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.509081    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.705014    3827 request.go:629] Waited for 195.886159ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:50.705123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:50.705134    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.705144    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.705151    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.708274    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:50.906912    3827 request.go:629] Waited for 198.179332ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:50.906985    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:50.906991    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.906997    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.907002    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.908938    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:50.909509    3827 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.909519    3827 pod_ready.go:81] duration metric: took 400.431581ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.909525    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.106576    3827 request.go:629] Waited for 197.012349ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:51.106660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:51.106668    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.106677    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.106682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.109021    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.305894    3827 request.go:629] Waited for 196.495089ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:51.305945    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:51.306000    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.306010    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.306018    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.308864    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.309301    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:51.309311    3827 pod_ready.go:81] duration metric: took 399.779835ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.309324    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.504969    3827 request.go:629] Waited for 195.610894ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:51.505060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:51.505066    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.505072    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.505076    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.507056    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:51.705447    3827 request.go:629] Waited for 197.942219ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:51.705508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:51.705515    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.705522    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.705527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.707999    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.708367    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:51.708379    3827 pod_ready.go:81] duration metric: took 399.049193ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.708391    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.906469    3827 request.go:629] Waited for 198.035792ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:51.906523    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:51.906531    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.906539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.906545    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.909082    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.105038    3827 request.go:629] Waited for 195.597271ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:52.105087    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:52.105095    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.105157    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.105168    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.108049    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.108591    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:52.108604    3827 pod_ready.go:81] duration metric: took 400.204131ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:52.108615    3827 pod_ready.go:38] duration metric: took 26.314911332s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:52.108628    3827 api_server.go:52] waiting for apiserver process to appear ...
	I0731 10:06:52.108680    3827 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:06:52.120989    3827 api_server.go:72] duration metric: took 26.533695803s to wait for apiserver process to appear ...
	I0731 10:06:52.121002    3827 api_server.go:88] waiting for apiserver healthz status ...
	I0731 10:06:52.121014    3827 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 10:06:52.124310    3827 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 10:06:52.124340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 10:06:52.124344    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.124353    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.124358    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.124912    3827 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 10:06:52.124978    3827 api_server.go:141] control plane version: v1.30.3
	I0731 10:06:52.124989    3827 api_server.go:131] duration metric: took 3.981645ms to wait for apiserver health ...
	I0731 10:06:52.124994    3827 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 10:06:52.305762    3827 request.go:629] Waited for 180.72349ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.305845    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.305853    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.305861    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.305872    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.310548    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:52.315274    3827 system_pods.go:59] 24 kube-system pods found
	I0731 10:06:52.315286    3827 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 10:06:52.315289    3827 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:52.315292    3827 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:52.315295    3827 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:52.315298    3827 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:52.315301    3827 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:52.315303    3827 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:52.315306    3827 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:52.315311    3827 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:52.315313    3827 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:52.315316    3827 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:52.315319    3827 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:52.315322    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:52.315327    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:52.315330    3827 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:52.315333    3827 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:52.315335    3827 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:52.315338    3827 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:52.315341    3827 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:52.315343    3827 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:52.315346    3827 system_pods.go:61] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:52.315348    3827 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:52.315350    3827 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:52.315353    3827 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:52.315358    3827 system_pods.go:74] duration metric: took 190.3593ms to wait for pod list to return data ...
	I0731 10:06:52.315363    3827 default_sa.go:34] waiting for default service account to be created ...
	I0731 10:06:52.505103    3827 request.go:629] Waited for 189.702061ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:52.505178    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:52.505187    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.505195    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.505199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.507558    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.507636    3827 default_sa.go:45] found service account: "default"
	I0731 10:06:52.507644    3827 default_sa.go:55] duration metric: took 192.276446ms for default service account to be created ...
	I0731 10:06:52.507666    3827 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 10:06:52.705427    3827 request.go:629] Waited for 197.710286ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.705484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.705497    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.705519    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.705526    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.711904    3827 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 10:06:52.716760    3827 system_pods.go:86] 24 kube-system pods found
	I0731 10:06:52.716772    3827 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 10:06:52.716777    3827 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:52.716780    3827 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:52.716783    3827 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:52.716787    3827 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:52.716790    3827 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:52.716794    3827 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:52.716798    3827 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:52.716801    3827 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:52.716805    3827 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:52.716809    3827 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:52.716813    3827 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:52.716816    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:52.716819    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:52.716823    3827 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:52.716827    3827 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:52.716830    3827 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:52.716833    3827 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:52.716836    3827 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:52.716854    3827 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:52.716860    3827 system_pods.go:89] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:52.716864    3827 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:52.716867    3827 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:52.716871    3827 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:52.716876    3827 system_pods.go:126] duration metric: took 209.203713ms to wait for k8s-apps to be running ...
	I0731 10:06:52.716881    3827 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 10:06:52.716936    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:06:52.731223    3827 system_svc.go:56] duration metric: took 14.33545ms WaitForService to wait for kubelet
	I0731 10:06:52.731240    3827 kubeadm.go:582] duration metric: took 27.143948309s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:06:52.731255    3827 node_conditions.go:102] verifying NodePressure condition ...
	I0731 10:06:52.906178    3827 request.go:629] Waited for 174.879721ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:52.906213    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:52.906218    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.906257    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.906264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.908378    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.909014    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909025    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909032    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909035    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909039    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909041    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909045    3827 node_conditions.go:105] duration metric: took 177.780993ms to run NodePressure ...
	I0731 10:06:52.909053    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:06:52.909067    3827 start.go:255] writing updated cluster config ...
	I0731 10:06:52.931184    3827 out.go:177] 
	I0731 10:06:52.952773    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:52.952858    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:52.974676    3827 out.go:177] * Starting "ha-393000-m04" worker node in "ha-393000" cluster
	I0731 10:06:53.016553    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:06:53.016583    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:06:53.016766    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:06:53.016784    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:06:53.016901    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:53.017869    3827 start.go:360] acquireMachinesLock for ha-393000-m04: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:06:53.017982    3827 start.go:364] duration metric: took 90.107µs to acquireMachinesLock for "ha-393000-m04"
	I0731 10:06:53.018005    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:06:53.018013    3827 fix.go:54] fixHost starting: m04
	I0731 10:06:53.018399    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:53.018423    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:53.027659    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52069
	I0731 10:06:53.028033    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:53.028349    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:53.028359    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:53.028586    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:53.028695    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:06:53.028810    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 10:06:53.028891    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.028978    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 10:06:53.029947    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid 3095 missing from process table
	I0731 10:06:53.029967    3827 fix.go:112] recreateIfNeeded on ha-393000-m04: state=Stopped err=<nil>
	I0731 10:06:53.029982    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	W0731 10:06:53.030076    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:06:53.051730    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m04" ...
	I0731 10:06:53.093566    3827 main.go:141] libmachine: (ha-393000-m04) Calling .Start
	I0731 10:06:53.093954    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.094004    3827 main.go:141] libmachine: (ha-393000-m04) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid
	I0731 10:06:53.094113    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Using UUID 8a49f5e0-ba79-41ac-9a76-c032dc065628
	I0731 10:06:53.120538    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Generated MAC d2:d8:fb:1d:1:ee
	I0731 10:06:53.120559    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:06:53.120750    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00032a1e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:53.120805    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00032a1e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:53.120864    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "8a49f5e0-ba79-41ac-9a76-c032dc065628", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:06:53.120909    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 8a49f5e0-ba79-41ac-9a76-c032dc065628 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:06:53.120925    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:06:53.122259    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Pid is 3870
	I0731 10:06:53.122766    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 0
	I0731 10:06:53.122781    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.122872    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3870
	I0731 10:06:53.125179    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 10:06:53.125242    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:06:53.125254    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:06:53.125266    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:06:53.125273    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:06:53.125280    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:06:53.125287    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Found match: d2:d8:fb:1d:1:ee
	I0731 10:06:53.125295    3827 main.go:141] libmachine: (ha-393000-m04) DBG | IP: 192.169.0.8
	I0731 10:06:53.125358    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetConfigRaw
	I0731 10:06:53.126014    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:06:53.126188    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:53.126707    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:06:53.126722    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:06:53.126959    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:06:53.127071    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:06:53.127158    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:06:53.127274    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:06:53.127389    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:06:53.127538    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:53.127705    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:06:53.127713    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:06:53.131247    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:06:53.140131    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:06:53.141373    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:53.141406    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:53.141429    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:53.141447    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:53.528683    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:06:53.528699    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:06:53.643451    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:53.643474    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:53.643483    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:53.643491    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:53.644344    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:06:53.644357    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:06:59.241509    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:06:59.241622    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:06:59.241636    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:06:59.265250    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:07:04.190144    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:07:04.190159    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.190326    3827 buildroot.go:166] provisioning hostname "ha-393000-m04"
	I0731 10:07:04.190338    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.190427    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.190528    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.190617    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.190711    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.190826    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.190962    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.191110    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.191119    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m04 && echo "ha-393000-m04" | sudo tee /etc/hostname
	I0731 10:07:04.259087    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m04
	
	I0731 10:07:04.259102    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.259236    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.259339    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.259439    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.259526    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.259647    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.259797    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.259811    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:07:04.323580    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:07:04.323604    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:07:04.323616    3827 buildroot.go:174] setting up certificates
	I0731 10:07:04.323623    3827 provision.go:84] configureAuth start
	I0731 10:07:04.323630    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.323758    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:04.323858    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.323932    3827 provision.go:143] copyHostCerts
	I0731 10:07:04.323960    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:07:04.324021    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:07:04.324027    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:07:04.324150    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:07:04.324352    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:07:04.324397    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:07:04.324402    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:07:04.324482    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:07:04.324627    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:07:04.324668    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:07:04.324674    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:07:04.324752    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:07:04.324900    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m04 san=[127.0.0.1 192.169.0.8 ha-393000-m04 localhost minikube]
	I0731 10:07:04.518738    3827 provision.go:177] copyRemoteCerts
	I0731 10:07:04.518793    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:07:04.518809    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.518951    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.519038    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.519124    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.519202    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:04.553750    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:07:04.553834    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:07:04.574235    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:07:04.574311    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:07:04.594359    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:07:04.594433    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:07:04.614301    3827 provision.go:87] duration metric: took 290.6663ms to configureAuth
	I0731 10:07:04.614319    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:07:04.614509    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:04.614526    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:04.614676    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.614777    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.614880    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.614987    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.615110    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.615236    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.615386    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.615394    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:07:04.672493    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:07:04.672505    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:07:04.672600    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:07:04.672612    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.672752    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.672835    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.672958    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.673042    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.673159    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.673303    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.673352    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:07:04.741034    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	Environment=NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:07:04.741052    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.741187    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.741288    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.741387    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.741494    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.741621    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.741755    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.741771    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:07:06.325916    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:07:06.325931    3827 machine.go:97] duration metric: took 13.199216588s to provisionDockerMachine
	I0731 10:07:06.325941    3827 start.go:293] postStartSetup for "ha-393000-m04" (driver="hyperkit")
	I0731 10:07:06.325948    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:07:06.325960    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.326146    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:07:06.326163    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.326257    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.326346    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.326438    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.326522    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.369998    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:07:06.375343    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:07:06.375359    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:07:06.375470    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:07:06.375663    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:07:06.375669    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:07:06.375894    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:07:06.394523    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:07:06.415884    3827 start.go:296] duration metric: took 89.928396ms for postStartSetup
	I0731 10:07:06.415906    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.416074    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:07:06.416088    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.416193    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.416287    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.416381    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.416451    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.451487    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:07:06.451545    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:07:06.482558    3827 fix.go:56] duration metric: took 13.464545279s for fixHost
	I0731 10:07:06.482584    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.482724    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.482806    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.482891    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.482992    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.483122    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:06.483263    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:06.483270    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:07:06.539713    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445626.658160546
	
	I0731 10:07:06.539725    3827 fix.go:216] guest clock: 1722445626.658160546
	I0731 10:07:06.539731    3827 fix.go:229] Guest: 2024-07-31 10:07:06.658160546 -0700 PDT Remote: 2024-07-31 10:07:06.482574 -0700 PDT m=+124.148842929 (delta=175.586546ms)
	I0731 10:07:06.539746    3827 fix.go:200] guest clock delta is within tolerance: 175.586546ms
	I0731 10:07:06.539751    3827 start.go:83] releasing machines lock for "ha-393000-m04", held for 13.521760862s
	I0731 10:07:06.539766    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.539895    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:06.564336    3827 out.go:177] * Found network options:
	I0731 10:07:06.583958    3827 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	W0731 10:07:06.605128    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605143    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605170    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:07:06.605183    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605593    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605717    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605786    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:07:06.605816    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	W0731 10:07:06.605831    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605845    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605864    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:07:06.605930    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:07:06.605931    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.605944    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.606068    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.606081    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.606172    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.606197    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.606270    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.606322    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.606369    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	W0731 10:07:06.638814    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:07:06.638878    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:07:06.685734    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:07:06.685752    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:07:06.685831    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:07:06.701869    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:07:06.710640    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:07:06.719391    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:07:06.719452    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:07:06.728151    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:07:06.736695    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:07:06.745525    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:07:06.754024    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:07:06.762489    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:07:06.770723    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:07:06.779179    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:07:06.787524    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:07:06.795278    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:07:06.802833    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:06.908838    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:07:06.929085    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:07:06.929153    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:07:06.946994    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:07:06.958792    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:07:06.977007    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:07:06.987118    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:07:06.998383    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:07:07.019497    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:07:07.030189    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:07:07.045569    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:07:07.048595    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:07:07.055870    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:07:07.070037    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:07:07.166935    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:07:07.272420    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:07:07.272447    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:07:07.286182    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:07.397807    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:07:09.678871    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.281044692s)
	I0731 10:07:09.678935    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:07:09.691390    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:07:09.706154    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:07:09.718281    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:07:09.818061    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:07:09.918372    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:10.020296    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:07:10.034132    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:07:10.045516    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:10.140924    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:07:10.198542    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:07:10.198622    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:07:10.202939    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:07:10.203007    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:07:10.206254    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:07:10.238107    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:07:10.238184    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:07:10.256129    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:07:10.301307    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:07:10.337880    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:07:10.396169    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 10:07:10.454080    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	I0731 10:07:10.491070    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:10.491478    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:07:10.496573    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:07:10.506503    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:07:10.506687    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:10.506931    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:07:10.506954    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:07:10.515949    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52091
	I0731 10:07:10.516322    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:07:10.516656    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:07:10.516668    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:07:10.516893    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:07:10.517004    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:07:10.517099    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:07:10.517181    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:07:10.518192    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:07:10.518454    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:07:10.518477    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:07:10.527151    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52093
	I0731 10:07:10.527586    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:07:10.527914    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:07:10.527931    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:07:10.528158    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:07:10.528268    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:07:10.528367    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.8
	I0731 10:07:10.528374    3827 certs.go:194] generating shared ca certs ...
	I0731 10:07:10.528388    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:07:10.528576    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:07:10.528655    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:07:10.528666    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:07:10.528692    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:07:10.528712    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:07:10.528731    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:07:10.528834    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:07:10.528887    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:07:10.528897    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:07:10.528933    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:07:10.528968    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:07:10.529000    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:07:10.529077    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:07:10.529114    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.529135    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.529152    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.529176    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:07:10.550191    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:07:10.570588    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:07:10.590746    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:07:10.611034    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:07:10.631281    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:07:10.651472    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:07:10.671880    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:07:10.676790    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:07:10.685541    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.689430    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.689496    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.694391    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:07:10.703456    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:07:10.712113    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.715734    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.715795    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.720285    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:07:10.728964    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:07:10.737483    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.741091    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.741135    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.745570    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:07:10.754084    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:07:10.757225    3827 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 10:07:10.757258    3827 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.30.3 docker false true} ...
	I0731 10:07:10.757327    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:07:10.757375    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:07:10.764753    3827 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 10:07:10.764797    3827 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256
	I0731 10:07:10.772338    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 10:07:10.772344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256
	I0731 10:07:10.772398    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:07:10.772434    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 10:07:10.772437    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 10:07:10.780324    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 10:07:10.780354    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 10:07:10.780356    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 10:07:10.780369    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 10:07:10.799303    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 10:07:10.799462    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 10:07:10.842469    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 10:07:10.842511    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 10:07:11.478912    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0731 10:07:11.486880    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:07:11.501278    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:07:11.515550    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:07:11.518663    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:07:11.528373    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:11.625133    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:07:11.645175    3827 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0731 10:07:11.645375    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:11.651211    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:07:11.692705    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:11.797111    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:07:12.534860    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:07:12.535084    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:07:12.535128    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:07:12.535291    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m04" to be "Ready" ...
	I0731 10:07:12.535335    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:12.535339    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:12.535359    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:12.535366    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:12.537469    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:13.035600    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:13.035613    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:13.035620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:13.035622    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:13.037811    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:13.536601    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:13.536621    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:13.536630    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:13.536636    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:13.539103    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.035926    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:14.035943    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:14.035952    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:14.035957    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:14.038327    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.535691    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:14.535711    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:14.535719    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:14.535723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:14.538107    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.538174    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:15.035707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:15.035726    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:15.035735    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:15.035739    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:15.037991    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:15.535587    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:15.535602    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:15.535658    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:15.535663    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:15.537787    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.035475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:16.035497    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:16.035550    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:16.035555    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:16.037882    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.536666    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:16.536687    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:16.536712    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:16.536719    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:16.538796    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.538904    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:17.035473    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:17.035488    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:17.035495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:17.035498    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:17.037610    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:17.535997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:17.536074    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:17.536089    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:17.536096    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:17.539102    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:18.035624    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:18.035638    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:18.035646    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:18.035652    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:18.037956    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:18.535491    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:18.535589    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:18.535603    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:18.535610    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:18.538819    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:18.538965    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:19.036954    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:19.037007    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:19.037028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:19.037033    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:19.039345    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:19.536847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:19.536862    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:19.536870    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:19.536873    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:19.538820    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:20.037064    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:20.037079    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:20.037086    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:20.037089    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:20.038945    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:20.536127    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:20.536138    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:20.536145    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:20.536150    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:20.538039    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:21.036613    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:21.036684    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:21.036695    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:21.036701    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:21.039123    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:21.039186    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:21.536684    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:21.536700    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:21.536705    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:21.536708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:21.538918    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:22.036722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:22.036736    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:22.036743    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:22.036746    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:22.038627    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:22.536686    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:22.536704    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:22.536714    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:22.536718    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:22.538549    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:23.036470    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:23.036482    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:23.036489    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:23.036494    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:23.038533    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:23.535581    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:23.535639    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:23.535653    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:23.535667    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:23.539678    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:23.539740    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:24.036874    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:24.036948    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:24.036959    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:24.036965    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:24.039843    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:24.536241    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:24.536307    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:24.536318    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:24.536323    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:24.538807    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:25.036279    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:25.036343    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:25.036356    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:25.036362    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:25.038454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:25.535942    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:25.535954    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:25.535962    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:25.535967    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:25.538068    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:26.036823    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:26.036838    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:26.036845    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:26.036848    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:26.038942    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:26.039008    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:26.535480    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:26.535499    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:26.535533    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:26.535539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:26.538039    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:27.036202    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:27.036213    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:27.036219    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:27.036222    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:27.038071    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:27.537206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:27.537226    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:27.537236    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:27.537248    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:27.539573    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:28.036203    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:28.036217    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:28.036223    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:28.036225    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:28.038017    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:28.536971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:28.536988    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:28.536998    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:28.537003    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:28.539378    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:28.539442    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:29.035655    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:29.035667    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:29.035673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:29.035676    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:29.037786    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:29.537109    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:29.537124    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:29.537132    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:29.537144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:29.539430    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:30.035887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:30.035899    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:30.035905    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:30.035908    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:30.037803    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:30.535679    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:30.535701    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:30.535718    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:30.535723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:30.539029    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:31.036151    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:31.036166    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:31.036175    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:31.036179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:31.038532    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:31.038593    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:31.536698    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:31.536710    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:31.536717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:31.536720    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:31.538484    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:32.037162    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:32.037178    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:32.037185    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:32.037188    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:32.039081    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:32.536065    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:32.536085    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:32.536095    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:32.536099    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:32.538365    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:33.036492    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:33.036513    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:33.036523    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:33.036527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:33.038851    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:33.038919    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:33.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:33.535566    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:33.535572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:33.535576    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:33.537575    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:34.036894    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:34.036912    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:34.036923    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:34.036932    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:34.040173    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:34.535858    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:34.535912    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:34.535919    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:34.535922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:34.537915    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:35.036636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:35.036670    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:35.036677    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:35.036682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:35.038861    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:35.038930    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:35.535814    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:35.535827    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:35.535835    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:35.535840    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:35.538360    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:36.035769    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:36.035785    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:36.035795    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:36.035799    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:36.038202    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:36.535426    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:36.535438    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:36.535445    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:36.535449    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:36.537303    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:37.035456    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:37.035470    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:37.035479    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:37.035483    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:37.037630    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:37.536548    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:37.536562    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:37.536568    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:37.536572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:37.538659    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:37.538720    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:38.036407    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:38.036421    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:38.036427    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:38.036432    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:38.038467    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:38.537359    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:38.537378    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:38.537387    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:38.537392    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:38.539892    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:39.036414    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:39.036470    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:39.036486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:39.036495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:39.039521    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:39.535817    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:39.535832    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:39.535839    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:39.535843    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:39.537796    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:40.035880    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:40.035896    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:40.035902    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:40.035906    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:40.037712    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:40.037778    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:40.535492    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:40.535523    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:40.535536    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:40.535543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:40.538475    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:41.035745    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:41.035758    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:41.035770    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:41.035774    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:41.037656    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:41.535726    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:41.535738    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:41.535744    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:41.535747    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:41.537897    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:42.036537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:42.036554    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:42.036564    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:42.036573    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:42.039525    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:42.039600    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:42.535450    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:42.535465    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:42.535472    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:42.535475    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:42.537399    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:43.035576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:43.035592    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:43.035598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:43.035602    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:43.038048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:43.536787    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:43.536822    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:43.536832    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:43.536837    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:43.539146    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:44.036148    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:44.036161    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:44.036169    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:44.036173    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:44.038382    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:44.536653    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:44.536709    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:44.536717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:44.536720    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:44.538695    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:44.538753    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:45.036650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:45.036662    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:45.036668    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:45.036672    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:45.038555    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:45.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:45.535571    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:45.535582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:45.535590    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:45.538335    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:46.035712    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:46.035726    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:46.035735    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:46.035740    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:46.038035    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:46.535534    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:46.535549    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:46.535557    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:46.535564    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:46.537974    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:47.035871    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:47.035887    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:47.035893    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:47.035897    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:47.037864    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:47.037931    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:47.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:47.535564    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:47.535570    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:47.535573    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:47.537590    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:48.035461    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:48.035531    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:48.035539    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:48.035543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:48.037510    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:48.536520    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:48.536535    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:48.536541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:48.536544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:48.538561    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:49.035436    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:49.035448    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:49.035454    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:49.035458    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:49.037204    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:49.535574    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:49.535586    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:49.535592    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:49.535595    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:49.537443    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:49.537505    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:50.036533    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:50.036547    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:50.036562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:50.036566    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:50.038478    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:50.536624    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:50.536636    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:50.536642    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:50.536646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:50.538734    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.036016    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:51.036035    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:51.036044    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:51.036049    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:51.038643    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.536662    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:51.536677    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:51.536686    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:51.536691    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:51.539033    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.539099    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:52.036475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:52.036490    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:52.036499    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:52.036503    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:52.038975    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:52.537013    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:52.537034    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:52.537041    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:52.537045    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:52.539229    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.037093    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:53.037106    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:53.037113    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:53.037117    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:53.039169    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.536447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:53.536468    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:53.536478    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:53.536486    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:53.539425    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.539565    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:54.035597    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:54.035609    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:54.035615    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:54.035618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:54.037574    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:54.535484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:54.535503    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:54.535509    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:54.535514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:54.537529    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:55.036258    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:55.036270    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:55.036277    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:55.036280    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:55.038186    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:55.536493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:55.536513    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:55.536526    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:55.536533    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:55.539517    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:55.539589    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:56.035565    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:56.035586    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:56.035599    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:56.035605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:56.040006    3827 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0731 10:07:56.536361    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:56.536374    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:56.536380    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:56.536383    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:56.538540    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:57.036446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:57.036544    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:57.036560    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:57.036567    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:57.039754    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:57.536620    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:57.536630    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:57.536637    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:57.536639    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:57.538482    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:58.036499    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:58.036518    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:58.036527    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:58.036532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:58.039244    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:58.039325    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:58.537076    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:58.537105    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:58.537197    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:58.537204    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:58.539718    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:59.037046    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:59.037127    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:59.037142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:59.037149    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:59.040197    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:59.536758    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:59.536790    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:59.536798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:59.536802    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:59.538842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:00.035440    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:00.035453    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:00.035460    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:00.035463    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:00.037506    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:00.536873    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:00.536895    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:00.536906    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:00.536913    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:00.540041    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:00.540123    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:01.036175    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:01.036225    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:01.036239    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:01.036248    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:01.039214    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:01.535960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:01.535973    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:01.535979    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:01.535983    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:01.538089    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:02.036835    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:02.036856    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:02.036868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:02.036875    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:02.039802    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:02.536647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:02.536660    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:02.536667    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:02.536670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:02.538840    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:03.036159    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:03.036174    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:03.036181    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:03.036184    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:03.038276    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:03.038354    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:03.536974    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:03.536990    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:03.536996    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:03.537000    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:03.538828    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:04.036300    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:04.036363    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:04.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:04.036391    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:04.038707    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:04.535718    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:04.535737    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:04.535749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:04.535759    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:04.538366    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:05.036299    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:05.036316    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:05.036350    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:05.036354    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:05.038510    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:05.038568    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:05.535824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:05.535837    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:05.535843    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:05.535846    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:05.537780    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:06.036578    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:06.036592    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:06.036607    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:06.036612    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:06.038642    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:06.535656    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:06.535670    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:06.535679    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:06.535682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:06.538248    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:07.036322    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:07.036396    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:07.036407    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:07.036412    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:07.038943    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:07.039003    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:07.536357    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:07.536370    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:07.536379    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:07.536384    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:07.538778    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:08.036360    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:08.036375    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:08.036381    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:08.036384    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:08.038393    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:08.536197    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:08.536266    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:08.536278    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:08.536284    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:08.538997    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:09.036883    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:09.036911    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:09.036918    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:09.036922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:09.039071    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:09.039137    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:09.535649    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:09.535664    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:09.535673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:09.535677    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:09.537998    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:10.036205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:10.036229    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:10.036241    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:10.036247    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:10.039273    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:10.536564    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:10.536575    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:10.536582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:10.536585    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:10.538369    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:11.036693    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:11.036710    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:11.036749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:11.036753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:11.038831    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:11.535438    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:11.535452    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:11.535461    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:11.535466    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:11.537490    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:11.537597    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:12.035786    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:12.035805    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:12.035812    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:12.035816    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:12.038145    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:12.536840    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:12.536858    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:12.536868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:12.536881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:12.538815    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.037034    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:13.037049    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:13.037056    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:13.037059    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:13.038933    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.535502    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:13.535519    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:13.535593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:13.535600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:13.537560    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.537648    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:14.036280    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:14.036300    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:14.036312    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:14.036322    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:14.039000    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:14.535507    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:14.535527    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:14.535537    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:14.535543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:14.538228    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:15.036543    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:15.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:15.036634    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:15.036643    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:15.039762    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:15.535993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:15.536006    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:15.536012    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:15.536015    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:15.538186    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:15.538254    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:16.035582    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:16.035595    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:16.035602    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:16.035605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:16.037656    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:16.536650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:16.536663    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:16.536709    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:16.536713    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:16.538604    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:17.036351    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:17.036372    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:17.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:17.036393    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:17.039451    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:17.536542    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:17.536560    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:17.536573    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:17.536582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:17.539454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:17.539591    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:18.036512    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:18.036578    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:18.036588    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:18.036593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:18.038886    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:18.535537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:18.535549    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:18.535554    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:18.535559    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:18.537559    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:19.035943    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:19.035968    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:19.035980    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:19.035987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:19.038665    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:19.536893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:19.536911    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:19.536920    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:19.536925    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:19.539416    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:20.036463    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:20.036479    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:20.036495    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:20.036500    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:20.038824    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:20.038907    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:20.536286    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:20.536306    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:20.536313    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:20.536316    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:20.538429    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:21.036034    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:21.036045    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:21.036051    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:21.036055    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:21.038101    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:21.535690    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:21.535711    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:21.535732    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:21.535740    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:21.538264    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:22.036592    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:22.036604    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:22.036610    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:22.036613    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:22.038773    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:22.536090    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:22.536103    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:22.536109    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:22.536114    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:22.537988    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:22.538057    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:23.035526    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:23.035555    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:23.035562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:23.035567    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:23.037480    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:23.536652    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:23.536666    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:23.536673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:23.536677    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:23.538667    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:24.036746    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:24.036766    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:24.036778    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:24.036789    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:24.039353    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:24.536440    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:24.536452    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:24.536459    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:24.536463    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:24.538250    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:24.538315    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:25.036622    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:25.036643    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:25.036656    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:25.036666    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:25.039764    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:25.535710    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:25.535721    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:25.535737    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:25.535742    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:25.537637    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:26.036253    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:26.036276    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:26.036338    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:26.036343    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:26.038674    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:26.536815    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:26.536828    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:26.536834    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:26.536838    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:26.538867    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:26.538932    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:27.035852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:27.035864    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:27.035869    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:27.035872    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:27.038024    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:27.535997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:27.536016    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:27.536028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:27.536036    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:27.539189    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:28.035934    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:28.036002    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:28.036011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:28.036014    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:28.037996    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:28.535538    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:28.535554    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:28.535561    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:28.535563    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:28.537842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:29.037018    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:29.037032    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:29.037039    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:29.037042    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:29.038983    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:29.039043    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:29.535757    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:29.535769    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:29.535775    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:29.535778    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:29.537697    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:30.036529    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:30.036548    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:30.036557    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:30.036562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:30.038833    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:30.535560    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:30.535570    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:30.535576    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:30.535579    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:30.537657    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:31.035508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:31.035520    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:31.035527    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:31.035531    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:31.037575    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:31.536786    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:31.536800    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:31.536806    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:31.536809    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:31.538674    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:31.538731    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:32.035819    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:32.035833    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:32.035842    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:32.035848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:32.038170    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:32.535455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:32.535471    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:32.535481    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:32.535487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:32.537802    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:33.037123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:33.037156    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:33.037166    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:33.037171    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:33.039252    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:33.535741    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:33.535754    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:33.535760    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:33.535763    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:33.537979    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:34.035638    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:34.035651    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:34.035658    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:34.035661    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:34.037722    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:34.037778    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:34.535808    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:34.535823    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:34.535831    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:34.535834    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:34.538223    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:35.036584    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:35.036609    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:35.036620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:35.036625    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:35.039788    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:35.535720    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:35.535732    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:35.535738    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:35.535741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:35.537506    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:36.036439    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:36.036484    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:36.036492    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:36.036498    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:36.038534    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:36.038591    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:36.535446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:36.535458    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:36.535465    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:36.535467    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:36.537309    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:37.035737    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:37.035776    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:37.035789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:37.035794    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:37.037928    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:37.535410    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:37.535422    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:37.535430    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:37.535433    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:37.537627    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:38.036658    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:38.036738    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:38.036753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:38.036760    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:38.039378    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:38.039521    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:38.535459    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:38.535474    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:38.535490    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:38.535494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:38.537817    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:39.036931    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:39.036949    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:39.036957    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:39.036962    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:39.039286    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:39.536447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:39.536472    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:39.536487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:39.536491    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:39.538440    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:40.036354    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:40.036378    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:40.036463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:40.036469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:40.039363    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:40.535847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:40.535866    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:40.535878    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:40.535883    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:40.538740    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:40.538822    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:41.036206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:41.036221    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:41.036229    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:41.036234    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:41.038292    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:41.535741    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:41.535753    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:41.535759    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:41.535764    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:41.537837    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:42.036537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:42.036558    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:42.036566    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:42.036570    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:42.039104    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:42.536474    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:42.536484    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:42.536491    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:42.536495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:42.538339    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:43.035887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:43.035913    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:43.035925    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:43.035931    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:43.038963    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:43.039028    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:43.537036    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:43.537050    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:43.537056    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:43.537059    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:43.539282    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:44.035937    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:44.035949    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:44.035954    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:44.035958    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:44.037693    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:44.536399    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:44.536470    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:44.536481    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:44.536485    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:44.538818    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:45.036937    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:45.036952    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:45.036960    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:45.036966    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:45.039363    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:45.039449    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:45.535403    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:45.535415    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:45.535421    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:45.535424    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:45.537208    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:46.037001    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:46.037088    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:46.037104    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:46.037110    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:46.040342    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:46.536255    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:46.536269    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:46.536278    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:46.536284    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:46.538801    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:47.037251    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:47.037286    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:47.037297    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:47.037304    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:47.039048    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:47.537021    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:47.537064    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:47.537071    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:47.537076    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:47.539084    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:47.539154    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:48.037354    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:48.037369    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:48.037376    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:48.037379    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:48.039646    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:48.536219    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:48.536236    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:48.536272    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:48.536276    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:48.538242    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:49.035446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:49.035459    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:49.035465    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:49.035469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:49.037563    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:49.535517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:49.535533    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:49.535540    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:49.535543    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:49.537433    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:50.036639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:50.036659    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:50.036665    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:50.036670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:50.038735    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:50.038803    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:50.535659    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:50.535678    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:50.535690    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:50.535697    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:50.538598    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:51.036768    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:51.036782    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:51.036789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:51.036794    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:51.038898    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:51.536592    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:51.536608    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:51.536616    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:51.536621    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:51.539087    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:52.036618    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:52.036639    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:52.036652    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:52.036658    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:52.039828    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:52.039911    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:52.535902    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:52.535912    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:52.535919    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:52.535922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:52.537950    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:53.036636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:53.036705    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:53.036716    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:53.036721    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:53.039002    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:53.535455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:53.535467    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:53.535473    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:53.535476    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:53.537615    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:54.036291    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:54.036325    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:54.036406    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:54.036414    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:54.039211    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:54.535751    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:54.535763    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:54.535769    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:54.535772    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:54.537488    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:54.537606    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:55.036966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:55.036982    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:55.036988    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:55.036992    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:55.038791    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:55.537260    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:55.537303    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:55.537312    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:55.537315    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:55.539579    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:56.036346    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:56.036359    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:56.036367    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:56.036370    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:56.038527    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:56.536015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:56.536055    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:56.536063    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:56.536068    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:56.538048    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:56.538106    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:57.036625    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:57.036637    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:57.036643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:57.036646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:57.038481    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:57.536731    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:57.536744    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:57.536749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:57.536752    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:57.538619    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:58.037081    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:58.037160    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:58.037174    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:58.037182    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:58.040222    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:58.535441    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:58.535453    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:58.535460    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:58.535463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:58.537373    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:59.037130    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:59.037151    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:59.037161    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:59.037181    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:59.039237    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:59.039342    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:59.536756    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:59.536768    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:59.536774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:59.536777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:59.538430    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:00.036701    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:00.036714    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:00.036720    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:00.036723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:00.038842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:00.535558    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:00.535574    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:00.535620    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:00.535625    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:00.537993    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.036274    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:01.036293    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:01.036302    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:01.036305    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:01.038700    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.536455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:01.536488    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:01.536495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:01.536511    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:01.538672    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.538736    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:02.036272    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:02.036286    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:02.036291    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:02.036295    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:02.038419    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:02.535392    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:02.535405    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:02.535416    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:02.535419    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:02.537336    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:03.036249    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:03.036264    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:03.036271    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:03.036276    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:03.038181    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:03.536990    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:03.537012    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:03.537020    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:03.537024    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:03.541054    3827 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0731 10:09:03.541125    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:04.036809    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:04.036887    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:04.036896    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:04.036902    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:04.039202    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:04.537089    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:04.537152    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:04.537166    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:04.537904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:04.540615    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:05.036817    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:05.036832    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:05.036838    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:05.036842    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:05.038865    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:05.535412    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:05.535430    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:05.535438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:05.535446    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:05.538103    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:06.036140    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:06.036160    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:06.036172    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:06.036179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:06.039025    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:06.039098    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:06.536908    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:06.536923    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:06.536930    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:06.536933    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:06.538854    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:07.035951    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:07.035965    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:07.035974    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:07.035979    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:07.038105    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:07.535618    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:07.535629    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:07.535635    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:07.535637    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:07.537552    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:08.036184    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:08.036212    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:08.036273    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:08.036279    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:08.038850    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:08.536040    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:08.536056    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:08.536065    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:08.536069    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:08.538402    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:08.538460    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:09.036971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:09.037018    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:09.037025    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:09.037031    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:09.039100    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:09.535468    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:09.535480    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:09.535487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:09.535490    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:09.537589    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.035464    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:10.035479    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:10.035491    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:10.035506    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:10.037831    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.536550    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:10.536622    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:10.536632    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:10.536638    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:10.539005    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.539064    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:11.037316    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:11.037399    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:11.037415    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:11.037425    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:11.040113    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:11.536965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:11.536989    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:11.537033    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:11.537044    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:11.539689    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:12.036399    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:12.036469    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:12.036480    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:12.036486    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:12.038399    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:12.535441    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:12.535463    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:12.535475    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:12.535486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:12.539207    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:12.539333    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:13.036110    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:13.036220    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:13.036231    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:13.036236    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:13.038510    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:13.535970    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:13.535990    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:13.536002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:13.536008    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:13.539197    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:14.037193    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:14.037263    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:14.037274    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:14.037286    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:14.039603    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:14.535571    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:14.535586    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:14.535591    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:14.535594    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:14.537915    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:15.036611    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:15.036630    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:15.036642    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:15.036648    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:15.039592    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:15.039739    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:15.535565    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:15.535590    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:15.535602    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:15.535608    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:15.539127    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:16.035884    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:16.035904    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:16.035915    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:16.035919    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:16.038938    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:16.535882    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:16.535893    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:16.535900    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:16.535904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:16.537836    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:17.036590    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:17.036605    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:17.036613    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:17.036618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:17.039082    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:17.535436    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:17.535454    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:17.535466    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:17.535472    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:17.539228    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:17.539295    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:18.035478    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:18.035491    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:18.035505    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:18.035509    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:18.037946    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:18.536869    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:18.536884    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:18.536890    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:18.536896    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:18.538941    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:19.035847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:19.035859    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:19.035865    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:19.035868    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:19.037761    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:19.536117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:19.536142    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:19.536154    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:19.536160    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:19.539347    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:19.539466    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:20.036919    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:20.036993    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:20.037004    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:20.037009    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:20.039230    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:20.536619    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:20.536716    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:20.536731    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:20.536738    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:20.539591    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:21.036024    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:21.036114    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:21.036129    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:21.036136    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:21.038666    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:21.535434    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:21.535447    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:21.535453    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:21.535457    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:21.537251    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:22.037204    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:22.037219    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:22.037228    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:22.037234    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:22.039524    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:22.039581    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:22.536431    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:22.536450    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:22.536464    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:22.536473    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:22.539233    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:23.035562    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:23.035606    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:23.035627    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:23.035634    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:23.037971    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:23.536650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:23.536675    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:23.536742    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:23.536752    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:23.539879    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:24.035514    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:24.035529    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:24.035535    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:24.035544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:24.037431    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:24.536058    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:24.536156    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:24.536171    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:24.536179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:24.538730    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:24.538810    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:25.036752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:25.036804    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:25.036814    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:25.036821    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:25.039117    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:25.535569    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:25.535587    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:25.535596    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:25.535600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:25.538114    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:26.035517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:26.035542    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:26.035556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:26.035562    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:26.038485    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:26.536365    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:26.536379    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:26.536386    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:26.536390    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:26.538690    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:27.036639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:27.036652    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:27.036703    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:27.036709    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:27.038432    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:27.038498    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:27.535539    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:27.535560    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:27.535572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:27.535580    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:27.538434    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:28.035626    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:28.035638    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:28.035644    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:28.035647    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:28.037699    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:28.536177    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:28.536199    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:28.536212    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:28.536217    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:28.539218    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:29.036925    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:29.036950    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:29.036962    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:29.036969    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:29.040007    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:29.040064    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:29.537194    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:29.537209    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:29.537228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:29.537240    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:29.539598    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:30.036373    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:30.036471    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:30.036486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:30.036494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:30.039302    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:30.536789    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:30.536807    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:30.536815    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:30.536820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:30.539885    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:31.036599    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:31.036624    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:31.036635    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:31.036643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:31.039815    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:31.536237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:31.536285    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:31.536295    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:31.536301    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:31.538680    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:31.538744    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:32.036451    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:32.036463    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:32.036469    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:32.036472    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:32.038847    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:32.536969    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:32.537019    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:32.537032    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:32.537041    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:32.539636    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:33.035557    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:33.035573    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:33.035582    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:33.035587    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:33.038048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:33.535485    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:33.535509    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:33.535522    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:33.535529    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:33.538268    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:34.035811    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:34.035830    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:34.035841    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:34.035846    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:34.038580    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:34.038645    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:34.535515    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:34.535533    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:34.535543    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:34.535562    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:34.537523    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:35.036865    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:35.036880    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:35.036887    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:35.036890    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:35.038894    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:35.535476    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:35.535566    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:35.535574    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:35.535579    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:35.537495    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:36.036205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:36.036221    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:36.036227    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:36.036231    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:36.038994    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:36.039061    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:36.536105    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:36.536117    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:36.536124    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:36.536127    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:36.538020    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:37.036112    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:37.036124    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:37.036130    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:37.036134    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:37.037953    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:37.536082    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:37.536101    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:37.536110    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:37.536114    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:37.538459    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:38.035493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:38.035509    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:38.035517    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:38.035524    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:38.037791    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:38.535613    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:38.535632    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:38.535645    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:38.535668    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:38.539185    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:38.539281    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:39.036660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:39.036682    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:39.036693    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:39.036700    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:39.039452    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:39.535986    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:39.536000    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:39.536007    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:39.536011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:39.537968    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:40.036939    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:40.037010    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:40.037021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:40.037026    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:40.039435    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:40.536149    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:40.536171    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:40.536233    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:40.536239    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:40.538338    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:41.036629    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:41.036641    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:41.036647    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:41.036651    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:41.038835    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:41.038897    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:41.536269    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:41.536280    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:41.536287    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:41.536290    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:41.538277    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:42.036495    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:42.036511    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:42.036520    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:42.036524    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:42.038560    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:42.537182    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:42.537201    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:42.537210    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:42.537215    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:42.539833    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.035857    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:43.035874    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:43.035881    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:43.035891    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:43.038530    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.536377    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:43.536465    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:43.536480    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:43.536488    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:43.539159    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.539217    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:44.036979    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:44.037065    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:44.037081    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:44.037089    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:44.039312    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:44.536993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:44.537011    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:44.537018    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:44.537063    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:44.539131    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.036929    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:45.036952    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:45.037050    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:45.037064    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:45.039700    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.537089    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:45.537112    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:45.537123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:45.537132    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:45.539940    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.540011    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:46.036811    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:46.036857    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:46.036868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:46.036882    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:46.039540    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:46.535831    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:46.535845    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:46.535852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:46.535856    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:46.538387    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:47.036117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:47.036128    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:47.036134    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:47.036137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:47.037871    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:47.536504    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:47.536553    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:47.536564    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:47.536568    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:47.538867    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:48.036960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:48.036980    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:48.036992    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:48.036998    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:48.040512    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:48.041066    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:48.535514    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:48.535532    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:48.535542    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:48.535547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:48.537881    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:49.036112    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:49.036124    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:49.036130    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:49.036133    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:49.038899    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:49.536876    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:49.536893    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:49.536899    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:49.536904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:49.538675    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:50.037190    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:50.037204    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:50.037213    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:50.037216    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:50.039015    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:50.536824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:50.536920    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:50.536935    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:50.536942    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:50.539735    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:50.539808    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:51.035683    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:51.035696    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:51.035702    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:51.035706    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:51.038883    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:51.536861    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:51.536882    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:51.536894    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:51.536901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:51.539779    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:52.035474    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:52.035485    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:52.035493    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:52.035499    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:52.037401    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:52.536642    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:52.536661    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:52.536669    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:52.536674    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:52.538949    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:53.036427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:53.036471    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:53.036482    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:53.036487    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:53.038951    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:53.039010    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:53.535427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:53.535439    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:53.535446    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:53.535450    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:53.537257    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:54.036806    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:54.036821    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:54.036828    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:54.036832    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:54.039021    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:54.535805    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:54.535897    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:54.535912    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:54.535919    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:54.538990    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:55.036521    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:55.036539    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:55.036546    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:55.036549    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:55.038766    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:55.536647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:55.536714    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:55.536723    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:55.536727    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:55.539055    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:55.539163    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:56.035522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:56.035534    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:56.035541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:56.035545    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:56.038160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:56.535916    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:56.535934    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:56.535943    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:56.535949    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:56.538329    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:57.036391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:57.036406    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:57.036413    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:57.036417    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:57.038267    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:57.535390    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:57.535439    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:57.535447    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:57.535452    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:57.537243    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:58.036752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:58.036778    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:58.036805    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:58.036809    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:58.038620    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:58.038682    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:58.536471    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:58.536516    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:58.536526    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:58.536532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:58.538643    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:59.035837    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:59.035851    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:59.035858    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:59.035861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:59.037705    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:59.536730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:59.536832    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:59.536848    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:59.536854    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:59.539682    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:00.035558    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:00.035587    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:00.035600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:00.035612    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:00.037523    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:00.535512    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:00.535528    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:00.535534    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:00.535537    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:00.537603    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:00.537667    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:01.036888    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:01.036943    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:01.036951    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:01.036955    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:01.038774    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:01.535488    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:01.535504    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:01.535513    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:01.535517    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:01.538017    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:02.036031    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:02.036045    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:02.036051    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:02.036054    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:02.037488    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:02.537218    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:02.537285    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:02.537295    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:02.537300    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:02.539559    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:02.539701    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:03.036241    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:03.036256    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:03.036263    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:03.036269    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:03.037763    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:03.536877    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:03.536892    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:03.536901    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:03.536904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:03.539168    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:04.035721    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:04.035733    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:04.035739    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:04.035742    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:04.037607    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:04.535679    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:04.535694    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:04.535703    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:04.535707    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:04.537920    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:05.037180    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:05.037195    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:05.037201    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:05.037205    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:05.038872    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:05.038947    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:05.536233    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:05.536248    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:05.536254    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:05.536258    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:05.538191    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:06.036830    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:06.036845    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:06.036852    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:06.036856    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:06.038427    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:06.536722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:06.536735    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:06.536741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:06.536753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:06.538631    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:07.036171    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:07.036186    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:07.036192    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:07.036195    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:07.038330    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:07.536466    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:07.536481    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:07.536488    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:07.536492    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:07.538446    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:07.538510    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:08.036787    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:08.036821    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:08.036832    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:08.036853    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:08.039084    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:08.535567    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:08.535582    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:08.535589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:08.535593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:08.537711    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.035421    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:09.035432    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:09.035438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:09.035442    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:09.037921    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.535887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:09.535904    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:09.535913    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:09.535943    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:09.538516    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.538592    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:10.035458    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:10.035469    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:10.035474    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:10.035477    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:10.038652    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:10.535979    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:10.535992    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:10.535998    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:10.536002    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:10.537981    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:11.035819    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:11.035886    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:11.035897    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:11.035901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:11.038043    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:11.535475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:11.535487    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:11.535494    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:11.535497    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:11.537395    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:12.036578    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:12.036591    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:12.036598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:12.036601    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:12.038621    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:12.038676    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:12.536927    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:12.536941    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:12.536947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:12.536952    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:12.539050    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:13.036386    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:13.036399    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:13.036428    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:13.036433    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:13.038022    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:13.536356    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:13.536376    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:13.536403    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:13.536406    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:13.538305    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:14.035960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:14.035973    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:14.035979    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:14.035983    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:14.037566    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:14.535889    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:14.535909    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:14.535920    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:14.535926    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:14.538796    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:14.538873    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:15.037263    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:15.037278    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:15.037284    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:15.037291    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:15.038934    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:15.535930    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:15.535949    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:15.535957    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:15.535961    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:15.538412    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:16.035774    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:16.035790    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:16.035798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:16.035803    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:16.037617    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:16.536338    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:16.536352    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:16.536359    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:16.536362    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:16.538545    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:17.036602    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:17.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:17.036625    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:17.036630    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:17.039042    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:17.039098    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:17.535886    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:17.535901    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:17.535907    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:17.535910    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:17.538060    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:18.036894    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:18.036938    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:18.036947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:18.036950    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:18.038702    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:18.535556    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:18.535571    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:18.535580    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:18.535586    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:18.537620    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:19.035993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:19.036009    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:19.036017    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:19.036021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:19.038160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:19.536410    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:19.536433    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:19.536444    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:19.536452    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:19.539613    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:19.539694    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:20.035430    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:20.035445    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:20.035456    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:20.035466    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:20.037008    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:20.536812    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:20.536836    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:20.536849    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:20.536855    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:20.539846    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:21.035730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:21.035746    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:21.035755    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:21.035761    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:21.037893    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:21.536119    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:21.536158    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:21.536173    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:21.536181    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:21.538305    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:22.035742    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:22.035779    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:22.035790    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:22.035796    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:22.038072    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:22.038175    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:22.536977    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:22.536992    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:22.536999    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:22.537002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:22.539319    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:23.036522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:23.036538    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:23.036544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:23.036547    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:23.038326    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:23.537176    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:23.537194    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:23.537202    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:23.537208    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:23.539537    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:24.036672    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:24.036686    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:24.036692    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:24.036696    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:24.038290    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:24.038347    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	[... GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04 polled every ~500ms from 10:10:24 through 10:10:57, each request returning 404 Not Found in 1-3 milliseconds; node_ready.go:53 repeatedly logged: error getting node "ha-393000-m04": nodes "ha-393000-m04" not found ...]
	I0731 10:10:57.535437    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:57.535446    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:57.535452    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:57.537842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.035459    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:58.035475    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:58.035484    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:58.035488    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:58.037963    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.536607    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:58.536625    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:58.536640    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:58.536653    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:58.539173    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.539233    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:59.035868    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:59.035890    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:59.035902    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:59.035912    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:59.038872    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:59.535411    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:59.535426    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:59.535432    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:59.535434    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:59.537913    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:00.036663    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:00.036679    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:00.036686    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:00.036690    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:00.038915    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:00.536586    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:00.536602    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:00.536610    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:00.536615    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:00.538823    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:01.037017    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:01.037041    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:01.037053    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:01.037058    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:01.039885    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:01.039956    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:01.537010    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:01.537022    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:01.537028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:01.537032    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:01.538870    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:02.036801    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:02.036819    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:02.036827    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:02.036831    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:02.039277    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:02.535479    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:02.535495    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:02.535501    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:02.535505    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:02.537168    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:03.037023    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:03.037069    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:03.037079    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:03.037084    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:03.039521    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:03.536060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:03.536073    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:03.536079    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:03.536083    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:03.538949    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:03.539021    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:04.036364    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:04.036379    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:04.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:04.036390    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:04.038419    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:04.536237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:04.536251    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:04.536260    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:04.536264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:04.538409    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:05.035688    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:05.035701    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:05.035708    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:05.035712    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:05.037474    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:05.535639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:05.535661    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:05.535671    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:05.535676    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:05.538235    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:06.036540    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:06.036554    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:06.036560    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:06.036564    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:06.039139    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:06.039201    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:06.536852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:06.536867    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:06.536875    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:06.536879    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:06.539160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:07.037400    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:07.037412    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:07.037419    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:07.037422    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:07.039316    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:07.535475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:07.535496    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:07.535507    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:07.535514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:07.538665    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:11:08.035588    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:08.035602    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:08.035609    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:08.035614    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:08.037450    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:08.535606    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:08.535617    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:08.535624    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:08.535628    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:08.537643    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:08.537700    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:09.036533    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:09.036549    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:09.036556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:09.036560    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:09.038511    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:09.536726    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:09.536794    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:09.536805    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:09.536810    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:09.539347    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.036599    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:10.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:10.036626    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:10.036630    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:10.038891    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.535919    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:10.535991    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:10.536003    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:10.536009    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:10.538198    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.538256    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:11.035775    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:11.035789    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:11.035795    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:11.035799    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:11.037602    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:11.535963    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:11.535977    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:11.535984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:11.535988    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:11.538020    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:12.035422    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:12.035494    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:12.035509    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:12.035514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:12.037902    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:12.536484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:12.536500    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:12.536506    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:12.536510    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:12.538333    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:12.538392    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:12.538407    3827 node_ready.go:38] duration metric: took 4m0.003142979s for node "ha-393000-m04" to be "Ready" ...
	I0731 10:11:12.560167    3827 out.go:177] 
	W0731 10:11:12.580908    3827 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0731 10:11:12.580926    3827 out.go:239] * 
	W0731 10:11:12.582125    3827 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:11:12.680641    3827 out.go:177] 
	
	
	==> Docker <==
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.914226423Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928630776Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928700349Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928854780Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.929029367Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.930900389Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.930985608Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.931085246Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.931220258Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928429805Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.933866106Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.933878374Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.934267390Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953115079Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953269656Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953688559Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953968281Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:38 ha-393000 dockerd[1174]: time="2024-07-31T17:06:38.259320248Z" level=info msg="ignoring event" container=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259503796Z" level=info msg="shim disconnected" id=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 namespace=moby
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259817357Z" level=warning msg="cleaning up after shim disconnected" id=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 namespace=moby
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259827803Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937784723Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937892479Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937935988Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.938076078Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	21ff27483d07f       6e38f40d628db                                                                                         5 minutes ago       Running             storage-provisioner       2                   31c959cec2158       storage-provisioner
	7500c837dfe73       8c811b4aec35f                                                                                         6 minutes ago       Running             busybox                   1                   f5579bdb56284       busybox-fc5497c4f-b94zr
	492e11c732d18       cbb01a7bd410d                                                                                         6 minutes ago       Running             coredns                   1                   22a2f7cb99560       coredns-7db6d8ff4d-wvqjl
	26d835568c733       cbb01a7bd410d                                                                                         6 minutes ago       Running             coredns                   1                   8336e3fbaa274       coredns-7db6d8ff4d-5m8st
	193af4895baf9       6f1d07c71fa0f                                                                                         6 minutes ago       Running             kindnet-cni               1                   304fa6a12c82b       kindnet-hjm7c
	4f56054bbee16       55bb025d2cfa5                                                                                         6 minutes ago       Running             kube-proxy                1                   7e638ed37b5ca       kube-proxy-zc52f
	c2de84de71d0d       6e38f40d628db                                                                                         6 minutes ago       Exited              storage-provisioner       1                   31c959cec2158       storage-provisioner
	42b34888f43b4       76932a3b37d7e                                                                                         6 minutes ago       Running             kube-controller-manager   6                   dd7a38b9a9134       kube-controller-manager-ha-393000
	bf0af6a864492       38af8ddebf499                                                                                         7 minutes ago       Running             kube-vip                  1                   7ae512ce66d9e       kube-vip-ha-393000
	0a6a6d756b8d8       76932a3b37d7e                                                                                         7 minutes ago       Exited              kube-controller-manager   5                   dd7a38b9a9134       kube-controller-manager-ha-393000
	a34d35a3b612b       3edc18e7b7672                                                                                         7 minutes ago       Running             kube-scheduler            2                   b550834f339ce       kube-scheduler-ha-393000
	488f4fddc126e       3861cfcd7c04c                                                                                         7 minutes ago       Running             etcd                      2                   35bc88d55a5f9       etcd-ha-393000
	7e0d32286913b       1f6d574d502f3                                                                                         7 minutes ago       Running             kube-apiserver            5                   913ebe1d27d36       kube-apiserver-ha-393000
	aec44315311a1       1f6d574d502f3                                                                                         9 minutes ago       Exited              kube-apiserver            4                   194073f1c5ac9       kube-apiserver-ha-393000
	86018b08bbaa1       3861cfcd7c04c                                                                                         11 minutes ago      Exited              etcd                      1                   ba75e4f4299bf       etcd-ha-393000
	5fcb6f7d8ab78       38af8ddebf499                                                                                         11 minutes ago      Exited              kube-vip                  0                   e6198932cc027       kube-vip-ha-393000
	d088fefe5f8e3       3edc18e7b7672                                                                                         11 minutes ago      Exited              kube-scheduler            1                   f04a7ecd568d2       kube-scheduler-ha-393000
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   15 minutes ago      Exited              busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         18 minutes ago      Exited              coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         18 minutes ago      Exited              coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              18 minutes ago      Exited              kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         18 minutes ago      Exited              kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	
	
	==> coredns [26d835568c73] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45868 - 37816 "HINFO IN 2903702352377705943.3393804209116430399. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.009308312s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[336879232]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.192) (total time: 30001ms):
	Trace[336879232]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.193)
	Trace[336879232]: [30.001669762s] [30.001669762s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[792684680]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.192) (total time: 30002ms):
	Trace[792684680]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.193)
	Trace[792684680]: [30.002844954s] [30.002844954s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[252017809]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.190) (total time: 30004ms):
	Trace[252017809]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.192)
	Trace[252017809]: [30.004125023s] [30.004125023s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [492e11c732d1] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:50203 - 38178 "HINFO IN 6515882504773672893.3508195612419770899. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.008964582s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1731745039]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.202) (total time: 30000ms):
	Trace[1731745039]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.203)
	Trace[1731745039]: [30.000463s] [30.000463s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1820975691]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.203) (total time: 30000ms):
	Trace[1820975691]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.203)
	Trace[1820975691]: [30.00019609s] [30.00019609s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[58591392]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.202) (total time: 30001ms):
	Trace[58591392]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.203)
	Trace[58591392]: [30.001286385s] [30.001286385s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [feda36fb8a03] <==
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	Name:               ha-393000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:53:48 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:12:38 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 17:06:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-393000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 9e10f5eb61854acbaf6547934383ee12
	  System UUID:                2cfe48dd-0000-0000-9b98-537ad9823a95
	  Boot ID:                    b9343713-c701-4963-b11c-cdefca0b39ab
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-b94zr              0 (0%)        0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 coredns-7db6d8ff4d-5m8st             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     18m
	  kube-system                 coredns-7db6d8ff4d-wvqjl             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     18m
	  kube-system                 etcd-ha-393000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         18m
	  kube-system                 kindnet-hjm7c                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      18m
	  kube-system                 kube-apiserver-ha-393000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-controller-manager-ha-393000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-proxy-zc52f                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-scheduler-ha-393000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-vip-ha-393000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m35s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 6m33s                  kube-proxy       
	  Normal  Starting                 18m                    kube-proxy       
	  Normal  NodeHasSufficientPID     18m                    kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  Starting                 18m                    kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  18m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  18m                    kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    18m                    kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  RegisteredNode           18m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  NodeReady                18m                    kubelet          Node ha-393000 status is now: NodeReady
	  Normal  RegisteredNode           17m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           16m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           13m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  Starting                 7m19s                  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  7m19s (x8 over 7m19s)  kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    7m19s (x8 over 7m19s)  kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     7m19s (x7 over 7m19s)  kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  7m19s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           6m37s                  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           6m29s                  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           6m                     node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           15s                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	
	
	Name:               ha-393000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:55:06 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:12:37 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:27 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-393000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 83c6a2bd65fe41eb8d2ed449f1d84121
	  System UUID:                7863443c-0000-0000-8e8d-bbd47bc06547
	  Boot ID:                    aad47d4e-f7f0-4bd8-87b6-edfb69496407
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-zln22                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 etcd-ha-393000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         17m
	  kube-system                 kindnet-lcwbs                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      17m
	  kube-system                 kube-apiserver-ha-393000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-controller-manager-ha-393000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-proxy-cf577                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-scheduler-ha-393000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-vip-ha-393000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         17m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 6m50s              kube-proxy       
	  Normal   Starting                 13m                kube-proxy       
	  Normal   Starting                 17m                kube-proxy       
	  Normal   NodeHasSufficientPID     17m (x7 over 17m)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  17m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  17m (x8 over 17m)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    17m (x8 over 17m)  kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   RegisteredNode           17m                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           17m                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           16m                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Warning  Rebooted                 13m                kubelet          Node ha-393000-m02 has been rebooted, boot id: febe9487-cc37-4f76-a943-4c3bd5898a28
	  Normal   NodeHasSufficientPID     13m                kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeHasNoDiskPressure    13m                kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   Starting                 13m                kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  13m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  13m                kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   RegisteredNode           13m                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   Starting                 7m                 kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  7m (x8 over 7m)    kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    7m (x8 over 7m)    kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     7m (x7 over 7m)    kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  7m                 kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           6m37s              node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           6m29s              node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           6m                 node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           15s                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	
	
	Name:               ha-393000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:56:18 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:12:32 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:11:31 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:11:31 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:11:31 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:11:31 +0000   Wed, 31 Jul 2024 16:56:40 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-393000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 6bd67d455470412d948a97ba6f8b8a9a
	  System UUID:                451d42a6-0000-0000-8ccb-b8851dda0594
	  Boot ID:                    0d534f8f-f62b-4786-808f-39cb1c1bf961
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-n8d7h                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 etcd-ha-393000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         16m
	  kube-system                 kindnet-s2pv6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      16m
	  kube-system                 kube-apiserver-ha-393000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-controller-manager-ha-393000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-proxy-cr9pg                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-scheduler-ha-393000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-vip-ha-393000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 6m12s              kube-proxy       
	  Normal   Starting                 16m                kube-proxy       
	  Normal   NodeAllocatableEnforced  16m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  16m (x8 over 16m)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    16m (x8 over 16m)  kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     16m (x7 over 16m)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           16m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           16m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           16m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           13m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           6m37s              node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           6m29s              node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   Starting                 6m16s              kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  6m16s              kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  6m16s              kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    6m16s              kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     6m16s              kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 6m16s              kubelet          Node ha-393000-m03 has been rebooted, boot id: 0d534f8f-f62b-4786-808f-39cb1c1bf961
	  Normal   RegisteredNode           6m                 node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           15s                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	
	
	Name:               ha-393000-m05
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m05
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T10_12_12_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 17:12:09 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m05
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:12:40 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:12:39 +0000   Wed, 31 Jul 2024 17:12:09 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:12:39 +0000   Wed, 31 Jul 2024 17:12:09 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:12:39 +0000   Wed, 31 Jul 2024 17:12:09 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:12:39 +0000   Wed, 31 Jul 2024 17:12:30 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.9
	  Hostname:    ha-393000-m05
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 bc282f6d719b4a90b740772af576c327
	  System UUID:                9f1942b2-0000-0000-9bc9-ed8a9a6bda42
	  Boot ID:                    eeb1a68f-a657-4b5c-998f-777ed2c95fa7
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.3.0/24
	PodCIDRs:                     10.244.3.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  kube-system                 etcd-ha-393000-m05                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         30s
	  kube-system                 kindnet-2vcxs                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      32s
	  kube-system                 kube-apiserver-ha-393000-m05             250m (12%)    0 (0%)      0 (0%)           0 (0%)         31s
	  kube-system                 kube-controller-manager-ha-393000-m05    200m (10%)    0 (0%)      0 (0%)           0 (0%)         31s
	  kube-system                 kube-proxy-8vlbk                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         29s
	  kube-system                 kube-scheduler-ha-393000-m05             100m (5%)     0 (0%)      0 (0%)           0 (0%)         31s
	  kube-system                 kube-vip-ha-393000-m05                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         28s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 28s                kube-proxy       
	  Normal  NodeHasSufficientMemory  32s (x8 over 32s)  kubelet          Node ha-393000-m05 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    32s (x8 over 32s)  kubelet          Node ha-393000-m05 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     32s (x7 over 32s)  kubelet          Node ha-393000-m05 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  32s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           30s                node-controller  Node ha-393000-m05 event: Registered Node ha-393000-m05 in Controller
	  Normal  RegisteredNode           29s                node-controller  Node ha-393000-m05 event: Registered Node ha-393000-m05 in Controller
	  Normal  RegisteredNode           27s                node-controller  Node ha-393000-m05 event: Registered Node ha-393000-m05 in Controller
	  Normal  RegisteredNode           15s                node-controller  Node ha-393000-m05 event: Registered Node ha-393000-m05 in Controller
	
	
	==> dmesg <==
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.035849] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.008140] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[  +5.683009] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.007123] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.689234] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.257015] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000008] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +2.569890] systemd-fstab-generator[473]: Ignoring "noauto" option for root device
	[  +0.101117] systemd-fstab-generator[485]: Ignoring "noauto" option for root device
	[  +1.260537] kauditd_printk_skb: 42 callbacks suppressed
	[  +0.721842] systemd-fstab-generator[1103]: Ignoring "noauto" option for root device
	[  +0.244917] systemd-fstab-generator[1140]: Ignoring "noauto" option for root device
	[  +0.105223] systemd-fstab-generator[1152]: Ignoring "noauto" option for root device
	[  +0.108861] systemd-fstab-generator[1166]: Ignoring "noauto" option for root device
	[  +2.483787] systemd-fstab-generator[1382]: Ignoring "noauto" option for root device
	[  +0.096628] systemd-fstab-generator[1394]: Ignoring "noauto" option for root device
	[  +0.110449] systemd-fstab-generator[1406]: Ignoring "noauto" option for root device
	[  +0.128159] systemd-fstab-generator[1422]: Ignoring "noauto" option for root device
	[  +0.446597] systemd-fstab-generator[1585]: Ignoring "noauto" option for root device
	[  +6.854766] kauditd_printk_skb: 271 callbacks suppressed
	[ +21.847998] kauditd_printk_skb: 40 callbacks suppressed
	[Jul31 17:06] kauditd_printk_skb: 80 callbacks suppressed
	
	
	==> etcd [488f4fddc126] <==
	{"level":"info","ts":"2024-07-31T17:06:27.348297Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:12:09.48261Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(2035864250365333051 13314548521573537860 14707668837576794450) learners=(3787017823365283298)"}
	{"level":"info","ts":"2024-07-31T17:12:09.48294Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844","added-peer-id":"348e30a557c2bde2","added-peer-peer-urls":["https://192.169.0.9:2380"]}
	{"level":"info","ts":"2024-07-31T17:12:09.482988Z","caller":"rafthttp/peer.go:133","msg":"starting remote peer","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.483013Z","caller":"rafthttp/pipeline.go:72","msg":"started HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.484207Z","caller":"rafthttp/peer.go:137","msg":"started remote peer","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.484274Z","caller":"rafthttp/transport.go:317","msg":"added remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2","remote-peer-urls":["https://192.169.0.9:2380"]}
	{"level":"info","ts":"2024-07-31T17:12:09.484339Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.484379Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.484392Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.48461Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"warn","ts":"2024-07-31T17:12:09.537095Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"348e30a557c2bde2","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"warn","ts":"2024-07-31T17:12:10.027125Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"348e30a557c2bde2","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"warn","ts":"2024-07-31T17:12:10.535309Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"348e30a557c2bde2","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-07-31T17:12:10.924451Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:10.92592Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"348e30a557c2bde2","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-07-31T17:12:10.925957Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:10.926147Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:10.927284Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"348e30a557c2bde2","stream-type":"stream Message"}
	{"level":"info","ts":"2024-07-31T17:12:10.92746Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:10.963168Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"warn","ts":"2024-07-31T17:12:11.526308Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"348e30a557c2bde2","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-07-31T17:12:12.027619Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(2035864250365333051 3787017823365283298 13314548521573537860 14707668837576794450)"}
	{"level":"info","ts":"2024-07-31T17:12:12.028082Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-07-31T17:12:12.028265Z","caller":"etcdserver/server.go:1946","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"348e30a557c2bde2"}
	
	
	==> etcd [86018b08bbaa] <==
	{"level":"info","ts":"2024-07-31T17:04:54.706821Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:54.70684Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:54.70685Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:04:55.363421Z","caller":"etcdserver/server.go:2089","msg":"failed to publish local member to cluster through raft","local-member-id":"b8c6c7563d17d844","local-member-attributes":"{Name:ha-393000 ClientURLs:[https://192.169.0.5:2379]}","request-path":"/0/members/b8c6c7563d17d844/attributes","publish-timeout":"7s","error":"etcdserver: request timed out"}
	{"level":"warn","ts":"2024-07-31T17:04:55.539618Z","caller":"etcdhttp/health.go:232","msg":"serving /health false; no leader"}
	{"level":"warn","ts":"2024-07-31T17:04:55.539664Z","caller":"etcdhttp/health.go:119","msg":"/health error","output":"{\"health\":\"false\",\"reason\":\"RAFT NO LEADER\"}","status-code":503}
	{"level":"info","ts":"2024-07-31T17:04:56.510556Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.510829Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.511027Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.51112Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.511212Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306509Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306743Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306923Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.307075Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.307212Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404702Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404767Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404769Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:04:59.405991Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"info","ts":"2024-07-31T17:05:00.106932Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106958Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106967Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106977Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106982Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	
	
	==> kernel <==
	 17:12:42 up 7 min,  0 users,  load average: 0.12, 0.22, 0.11
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [193af4895baf] <==
	I0731 17:12:19.070587       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:12:19.070700       1 main.go:299] handling current node
	I0731 17:12:19.072035       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:12:19.072138       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:12:19.072294       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:12:19.072346       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:12:19.072443       1 main.go:295] Handling node with IPs: map[192.169.0.9:{}]
	I0731 17:12:19.072527       1 main.go:322] Node ha-393000-m05 has CIDR [10.244.3.0/24] 
	I0731 17:12:19.072580       1 routes.go:62] Adding route {Ifindex: 0 Dst: 10.244.3.0/24 Src: <nil> Gw: 192.169.0.9 Flags: [] Table: 0} 
	I0731 17:12:29.067499       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:12:29.067687       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:12:29.068024       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:12:29.068167       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:12:29.068319       1 main.go:295] Handling node with IPs: map[192.169.0.9:{}]
	I0731 17:12:29.068428       1 main.go:322] Node ha-393000-m05 has CIDR [10.244.3.0/24] 
	I0731 17:12:29.068600       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:12:29.068650       1 main.go:299] handling current node
	I0731 17:12:39.071971       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:12:39.072157       1 main.go:299] handling current node
	I0731 17:12:39.072214       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:12:39.072232       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:12:39.072412       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:12:39.072497       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:12:39.072667       1 main.go:295] Handling node with IPs: map[192.169.0.9:{}]
	I0731 17:12:39.072783       1 main.go:322] Node ha-393000-m05 has CIDR [10.244.3.0/24] 
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:59:40.110698       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:59:50.118349       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:59:50.118427       1 main.go:299] handling current node
	I0731 16:59:50.118450       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:59:50.118464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:59:50.118651       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:59:50.118739       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.118883       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:00.118987       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:00.119126       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:00.119236       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.119356       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:00.119483       1 main.go:299] handling current node
	I0731 17:00:10.110002       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:10.111054       1 main.go:299] handling current node
	I0731 17:00:10.111286       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:10.111319       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:10.111445       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:10.111480       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:20.116250       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:20.116442       1 main.go:299] handling current node
	I0731 17:00:20.116458       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:20.116464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:20.116608       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:20.116672       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [7e0d32286913] <==
	I0731 17:05:50.070570       1 controller.go:80] Starting OpenAPI V3 AggregationController
	I0731 17:05:50.074783       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0731 17:05:50.074947       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:05:50.086677       1 crdregistration_controller.go:111] Starting crd-autoregister controller
	I0731 17:05:50.086708       1 shared_informer.go:313] Waiting for caches to sync for crd-autoregister
	I0731 17:05:50.117864       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0731 17:05:50.122120       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:05:50.122365       1 policy_source.go:224] refreshing policies
	I0731 17:05:50.132563       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0731 17:05:50.166384       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0731 17:05:50.168074       1 shared_informer.go:320] Caches are synced for configmaps
	I0731 17:05:50.168116       1 apf_controller.go:379] Running API Priority and Fairness config worker
	I0731 17:05:50.168122       1 apf_controller.go:382] Running API Priority and Fairness periodic rebalancing process
	I0731 17:05:50.170411       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0731 17:05:50.174248       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0731 17:05:50.178334       1 handler_discovery.go:447] Starting ResourceDiscoveryManager
	I0731 17:05:50.187980       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0731 17:05:50.188024       1 aggregator.go:165] initial CRD sync complete...
	I0731 17:05:50.188030       1 autoregister_controller.go:141] Starting autoregister controller
	I0731 17:05:50.188034       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0731 17:05:50.188038       1 cache.go:39] Caches are synced for autoregister controller
	E0731 17:05:50.205462       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I0731 17:05:51.075340       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0731 17:06:47.219071       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0731 17:07:08.422863       1 controller.go:615] quota admission added evaluator for: endpoints
	
	
	==> kube-apiserver [aec44315311a] <==
	I0731 17:03:27.253147       1 options.go:221] external host was not specified, using 192.169.0.5
	I0731 17:03:27.253888       1 server.go:148] Version: v1.30.3
	I0731 17:03:27.253988       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:03:27.786353       1 shared_informer.go:313] Waiting for caches to sync for node_authorizer
	I0731 17:03:27.788898       1 shared_informer.go:313] Waiting for caches to sync for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:03:27.790619       1 plugins.go:157] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0731 17:03:27.790629       1 plugins.go:160] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0731 17:03:27.790778       1 instance.go:299] Using reconciler: lease
	W0731 17:03:47.786207       1 logging.go:59] [core] [Channel #1 SubChannel #3] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0731 17:03:47.786314       1 logging.go:59] [core] [Channel #2 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	F0731 17:03:47.791937       1 instance.go:292] Error creating leases: error creating storage factory: context deadline exceeded
	
	
	==> kube-controller-manager [0a6a6d756b8d] <==
	I0731 17:05:30.561595       1 serving.go:380] Generated self-signed cert in-memory
	I0731 17:05:31.250391       1 controllermanager.go:189] "Starting" version="v1.30.3"
	I0731 17:05:31.250471       1 controllermanager.go:191] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:05:31.252077       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0731 17:05:31.252281       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0731 17:05:31.252444       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:05:31.254793       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	E0731 17:05:51.257636       1 controllermanager.go:234] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: an error on the server (\"[+]ping ok\\n[+]log ok\\n[+]etcd ok\\n[+]poststarthook/start-apiserver-admission-initializer ok\\n[+]poststarthook/generic-apiserver-start-informers ok\\n[+]poststarthook/priority-and-fairness-config-consumer ok\\n[+]poststarthook/priority-and-fairness-filter ok\\n[+]poststarthook/storage-object-count-tracker-hook ok\\n[+]poststarthook/start-apiextensions-informers ok\\n[+]poststarthook/start-apiextensions-controllers ok\\n[+]poststarthook/crd-informer-synced ok\\n[+]poststarthook/start-service-ip-repair-controllers ok\\n[-]poststarthook/rbac/bootstrap-roles failed: reason withheld\\n[+]poststarthook/scheduling/bootstrap-system-priority-classes ok\\n[+]poststarthook/priority-and-fairness-config-producer ok\\n[+]poststarthook/start-system-namespaces-controller ok\\n[+]poststarthook/bootstrap-controller ok\\n[+]poststarthook/start-cluster-authentication-info-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok\\n[+]poststarthook/start-legacy-token-tracking-controller ok\\n[+]poststarthook/aggregator-reload-proxy-client-cert ok\\n[+]poststarthook/start-kube-aggregator-informers ok\\n[+]poststarthook/apiservice-registration-controller ok\\n[+]poststarthook/apiservice-status-available-controller ok\\n[+]poststarthook/apiservice-discovery-controller ok\\n[+]poststarthook/kube-apiserver-autoregistration ok\\n[+]autoregister-completion ok\\n[+]poststarthook/apiservice-openapi-controller ok\\n[+]poststarthook/apiservice-openapiv3-controller ok\\nhealthz check failed\") has prevented the request from succeeding"
	
	
	==> kube-controller-manager [42b34888f43b] <==
	I0731 17:06:12.920443       1 shared_informer.go:320] Caches are synced for endpoint_slice
	I0731 17:06:12.952902       1 shared_informer.go:320] Caches are synced for endpoint_slice_mirroring
	I0731 17:06:12.964558       1 shared_informer.go:320] Caches are synced for resource quota
	I0731 17:06:13.012295       1 shared_informer.go:320] Caches are synced for resource quota
	I0731 17:06:13.022225       1 shared_informer.go:320] Caches are synced for ClusterRoleAggregator
	I0731 17:06:13.501091       1 shared_informer.go:320] Caches are synced for garbage collector
	I0731 17:06:13.558892       1 shared_informer.go:320] Caches are synced for garbage collector
	I0731 17:06:13.559095       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0731 17:06:26.973668       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="55.100255ms"
	I0731 17:06:26.975840       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="54.971µs"
	I0731 17:06:29.221856       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="14.898144ms"
	I0731 17:06:29.222046       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="66.05µs"
	I0731 17:06:47.214265       1 endpointslice_controller.go:311] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mvtct\": the object has been modified; please apply your changes to the latest version and try again"
	I0731 17:06:47.214807       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"a0a921f4-5219-42ca-94c6-a4038d9ff710", APIVersion:"v1", ResourceVersion:"259", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mvtct": the object has been modified; please apply your changes to the latest version and try again
	I0731 17:06:47.241205       1 endpointslice_controller.go:311] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mvtct\": the object has been modified; please apply your changes to the latest version and try again"
	I0731 17:06:47.241526       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="78.18352ms"
	I0731 17:06:47.241539       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"a0a921f4-5219-42ca-94c6-a4038d9ff710", APIVersion:"v1", ResourceVersion:"259", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mvtct": the object has been modified; please apply your changes to the latest version and try again
	E0731 17:06:47.241671       1 replica_set.go:557] sync "kube-system/coredns-7db6d8ff4d" failed with Operation cannot be fulfilled on replicasets.apps "coredns-7db6d8ff4d": the object has been modified; please apply your changes to the latest version and try again
	I0731 17:06:47.242012       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="316.596µs"
	I0731 17:06:47.246958       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="100.8µs"
	I0731 17:06:47.288893       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="32.237881ms"
	I0731 17:06:47.289070       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.102µs"
	I0731 17:12:09.257842       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-393000-m05\" does not exist"
	I0731 17:12:09.279224       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-393000-m05" podCIDRs=["10.244.3.0/24"]
	I0731 17:12:12.913824       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-393000-m05"
	
	
	==> kube-proxy [4f56054bbee1] <==
	I0731 17:06:08.426782       1 server_linux.go:69] "Using iptables proxy"
	I0731 17:06:08.446564       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 17:06:08.497695       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 17:06:08.497829       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 17:06:08.497985       1 server_linux.go:165] "Using iptables Proxier"
	I0731 17:06:08.502095       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 17:06:08.503040       1 server.go:872] "Version info" version="v1.30.3"
	I0731 17:06:08.503116       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:06:08.506909       1 config.go:192] "Starting service config controller"
	I0731 17:06:08.507443       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 17:06:08.507578       1 config.go:319] "Starting node config controller"
	I0731 17:06:08.507600       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 17:06:08.509126       1 config.go:101] "Starting endpoint slice config controller"
	I0731 17:06:08.509154       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 17:06:08.607797       1 shared_informer.go:320] Caches are synced for node config
	I0731 17:06:08.607880       1 shared_informer.go:320] Caches are synced for service config
	I0731 17:06:08.610417       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [a34d35a3b612] <==
	I0731 17:12:09.314226       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-vp778" node="ha-393000-m05"
	E0731 17:12:09.315516       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-bc6mj\": pod kindnet-bc6mj is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kindnet-bc6mj" node="ha-393000-m05"
	E0731 17:12:09.317311       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-bc6mj\": pod kindnet-bc6mj is already assigned to node \"ha-393000-m05\"" pod="kube-system/kindnet-bc6mj"
	I0731 17:12:09.317534       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-bc6mj" node="ha-393000-m05"
	E0731 17:12:09.329195       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-2vcxs\": pod kindnet-2vcxs is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kindnet-2vcxs" node="ha-393000-m05"
	E0731 17:12:09.329378       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-2vcxs\": pod kindnet-2vcxs is already assigned to node \"ha-393000-m05\"" pod="kube-system/kindnet-2vcxs"
	I0731 17:12:09.329590       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-2vcxs" node="ha-393000-m05"
	E0731 17:12:09.334254       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-rj84b\": pod kube-proxy-rj84b is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-rj84b" node="ha-393000-m05"
	E0731 17:12:09.334331       1 schedule_one.go:338] "scheduler cache ForgetPod failed" err="pod 16656a49-d07d-4c64-b134-5205c00964dd(kube-system/kube-proxy-rj84b) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-rj84b"
	E0731 17:12:09.334345       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-rj84b\": pod kube-proxy-rj84b is already assigned to node \"ha-393000-m05\"" pod="kube-system/kube-proxy-rj84b"
	I0731 17:12:09.334357       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-rj84b" node="ha-393000-m05"
	E0731 17:12:09.371198       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-kd8cb\": pod kube-proxy-kd8cb is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-kd8cb" node="ha-393000-m05"
	E0731 17:12:09.371261       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-kd8cb\": pod kube-proxy-kd8cb is already assigned to node \"ha-393000-m05\"" pod="kube-system/kube-proxy-kd8cb"
	E0731 17:12:10.476375       1 schedule_one.go:942] "Scheduler cache AssumePod failed" err="pod 5d11dce5-e8d2-44b5-9253-a247d8fdc231(kube-system/kube-proxy-kd8cb) is in the cache, so can't be assumed" pod="kube-system/kube-proxy-kd8cb"
	E0731 17:12:10.476672       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="pod 5d11dce5-e8d2-44b5-9253-a247d8fdc231(kube-system/kube-proxy-kd8cb) is in the cache, so can't be assumed" pod="kube-system/kube-proxy-kd8cb"
	I0731 17:12:10.476834       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-kd8cb" node="ha-393000-m05"
	E0731 17:12:10.795259       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-kdqqk\": pod kube-proxy-kdqqk is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-kdqqk" node="ha-393000-m05"
	E0731 17:12:10.796390       1 schedule_one.go:338] "scheduler cache ForgetPod failed" err="pod cfa57630-31cd-4723-af07-bedc5ba840bb(kube-system/kube-proxy-kdqqk) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-kdqqk"
	E0731 17:12:10.797071       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-kdqqk\": pod kube-proxy-kdqqk is already assigned to node \"ha-393000-m05\"" pod="kube-system/kube-proxy-kdqqk"
	I0731 17:12:10.797784       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-kdqqk" node="ha-393000-m05"
	E0731 17:12:12.552266       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-z758q\": pod kube-proxy-z758q is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-z758q" node="ha-393000-m05"
	E0731 17:12:12.552426       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-z758q\": pod kube-proxy-z758q is already assigned to node \"ha-393000-m05\"" pod="kube-system/kube-proxy-z758q"
	E0731 17:12:12.552850       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-8vlbk\": pod kube-proxy-8vlbk is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-8vlbk" node="ha-393000-m05"
	E0731 17:12:12.553110       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-8vlbk\": pod kube-proxy-8vlbk is already assigned to node \"ha-393000-m05\"" pod="kube-system/kube-proxy-8vlbk"
	I0731 17:12:12.553266       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-8vlbk" node="ha-393000-m05"
	
	
	==> kube-scheduler [d088fefe5f8e] <==
	E0731 17:04:26.658553       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:28.887716       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:28.887806       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:32.427417       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:32.427586       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:36.436787       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:36.436870       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:40.022061       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://192.169.0.5:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:40.022227       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://192.169.0.5:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:43.471012       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:43.471291       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:43.930296       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:43.930321       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:44.041999       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: Get "https://192.169.0.5:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:44.042358       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.169.0.5:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:48.230649       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:48.230983       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:58.373439       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:58.373554       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:05:00.249019       1 server.go:214] "waiting for handlers to sync" err="context canceled"
	I0731 17:05:00.249450       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	I0731 17:05:00.249577       1 secure_serving.go:258] Stopped listening on 127.0.0.1:10259
	E0731 17:05:00.249641       1 shared_informer.go:316] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0731 17:05:00.249670       1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	E0731 17:05:00.249984       1 run.go:74] "command failed" err="finished without leader elect"
	
	
	==> kubelet <==
	Jul 31 17:08:22 ha-393000 kubelet[1592]: E0731 17:08:22.903462    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:08:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:09:22 ha-393000 kubelet[1592]: E0731 17:09:22.903125    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:09:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:10:22 ha-393000 kubelet[1592]: E0731 17:10:22.903858    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:10:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:11:22 ha-393000 kubelet[1592]: E0731 17:11:22.902625    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:11:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:11:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:11:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:11:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:12:22 ha-393000 kubelet[1592]: E0731 17:12:22.904610    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:12:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:12:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:12:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:12:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:261: (dbg) Run:  kubectl --context ha-393000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:285: <<< TestMultiControlPlane/serial/AddSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/AddSecondaryNode (81.72s)

                                                
                                    
TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (4.88s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd
ha_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
ha_test.go:304: expected profile "ha-393000" in json of 'profile list' to include 4 nodes but have 5 nodes. got *"{\"invalid\":[],\"valid\":[{\"Name\":\"ha-393000\",\"Status\":\"HAppy\",\"Config\":{\"Name\":\"ha-393000\",\"KeepContext\":false,\"EmbedCerts\":false,\"MinikubeISO\":\"https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso\",\"KicBaseImage\":\"gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7\",\"Memory\":2200,\"CPUs\":2,\"DiskSize\":20000,\"Driver\":\"hyperkit\",\"HyperkitVpnKitSock\":\"\",\"HyperkitVSockPorts\":[],\"DockerEnv\":null,\"ContainerVolumeMounts\":null,\"InsecureRegistry\":null,\"RegistryMirror\":[],\"HostOnlyCIDR\":\"192.168.59.1/24\",\"HypervVirtualSwitch\":\"\",\"HypervUseExternalSwitch\":false,\"HypervExternalAdapter\":\"\",\"KVMNetwork\":\"default\",\"KVMQemuURI\":\"qemu:///system\",\"KVMGPU\":false,\"KVMHidden\":false,\"KVMNUMACount\":1,\"APIServerPort\":8443,\"DockerOpt\":null,\"DisableDriverMounts\":false,\"NFSShare\":[],\"NFSSharesRoot\":\"/nfsshares\",\"UUID\":\"\",\"NoVTXCheck\":false,\"DNSProxy\":false,\"HostDNSResolver\":true,\"HostOnlyNicType\":\"virtio\",\"NatNicType\":\"virtio\",\"SSHIPAddress\":\"\",\"SSHUser\":\"root\",\"SSHKey\":\"\",\"SSHPort\":22,\"KubernetesConfig\":{\"KubernetesVersion\":\"v1.30.3\",\"ClusterName\":\"ha-393000\",\"Namespace\":\"default\",\"APIServerHAVIP\":\"192.169.0.254\",\"APIServerName\":\"minikubeCA\",\"APIServerNames\":null,\"APIServerIPs\":null,\"DNSDomain\":\"cluster.local\",\"ContainerRuntime\":\"docker\",\"CRISocket\":\"\",\"NetworkPlugin\":\"cni\",\"FeatureGates\":\"\",\"ServiceCIDR\":\"10.96.0.0/12\",\"ImageRepository\":\"\",\"LoadBalancerStartIP\":\"\",\"LoadBalancerEndIP\":\"\",\"CustomIngressCert\":\"\",\"RegistryAliases\":\"\",\"ExtraOptions\":null,\"ShouldLoadCachedImages\":true,\"EnableDefaultCNI\":false,\"CNI\":\"\"},\"Nodes\":[{\"Name\":\"\",\"IP\":\"192.169.0.5\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m02\",\"IP\":\"192.169.0.6\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m03\",\"IP\":\"192.169.0.7\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m04\",\"IP\":\"192.169.0.8\",\"Port\":0,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":false,\"Worker\":true},{\"Name\":\"m05\",\"IP\":\"192.169.0.9\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.3\",\"ContainerRuntime\":\"\",\"ControlPlane\":true,\"Worker\":true}],\"Addons\":{\"ambassador\":false,\"auto-pause\":false,\"cloud-spanner\":false,\"csi-hostpath-driver\":false,\"dashboard\":false,\"default-storageclass\":false,\"efk\":false,\"freshpod\":false,\"gcp-auth\":false,\"gvisor\":false,\"headlamp\":false,\"helm-tiller\":false,\"inaccel\":false,\"ingress\":false,\"ingress-dns\":false,\"inspektor-gadget\":false,\"istio\":false,\"istio-provisioner\":false,\"kong\":false,\"kubeflow\":false,\"kubevirt\":false,\"logviewer\":false,\"metallb\":false,\"metrics-server\":false,\"nvidia-device-plugin\":false,\"nvidia-driver-installer\":false,\"nvidia-gpu-device-plugin\":false,\"olm\":false,\"pod-security-policy\":false,\"portainer\":false,\"registry\":false,\"registry-aliases\":false,\"registry-creds\":false,\"storage-provisioner\":false,\"storage-provisioner-gluster\":false,\"storage-provisioner-rancher\":false,\"volcano\":false,\"volumesnapshots\":false,\"yakd\":false},\"CustomAddonImages\":null,\"CustomAddonRegistries\":null,\"VerifyComponents\":{\"apiserver\":true,\"apps_running\":true,\"default_sa\":true,\"extra\":true,\"kubelet\":true,\"node_ready\":true,\"system_pods\":true},\"StartHostTimeout\":360000000000,\"ScheduledStop\":null,\"ExposedPorts\":[],\"ListenAddress\":\"\",\"Network\":\"\",\"Subnet\":\"\",\"MultiNodeRequested\":true,\"ExtraDisks\":0,\"CertExpiration\":94608000000000000,\"Mount\":false,\"MountString\":\"/Users:/minikube-host\",\"Mount9PVersion\":\"9p2000.L\",\"MountGID\":\"docker\",\"MountIP\":\"\",\"MountMSize\":262144,\"MountOptions\":[],\"MountPort\":0,\"MountType\":\"9p\",\"MountUID\":\"docker\",\"BinaryMirror\":\"\",\"DisableOptimizations\":false,\"DisableMetrics\":false,\"CustomQemuFirmwarePath\":\"\",\"SocketVMnetClientPath\":\"\",\"SocketVMnetPath\":\"\",\"StaticIP\":\"\",\"SSHAuthSock\":\"\",\"SSHAgentPID\":0,\"GPUs\":\"\",\"AutoPauseInterval\":60000000000},\"Active\":false,\"ActiveKubeContext\":true}]}"*. args: "out/minikube-darwin-amd64 profile list --output json"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-393000 -n ha-393000
helpers_test.go:244: <<< TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p ha-393000 logs -n 25: (3.468071183s)
helpers_test.go:252: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- get pods -o          | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-b94zr -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-n8d7h -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-393000 -- exec                 | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT | 31 Jul 24 09:56 PDT |
	|         | busybox-fc5497c4f-zln22 -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.169.0.1             |           |         |         |                     |                     |
	| node    | add -p ha-393000 -v=7                | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:56 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node stop m02 -v=7         | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:58 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-393000 node start m02 -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 09:58 PDT | 31 Jul 24 09:59 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000 -v=7               | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-393000 -v=7                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT | 31 Jul 24 10:00 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true -v=7        | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:00 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-393000                    | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	| node    | ha-393000 node delete m03 -v=7       | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | ha-393000 stop -v=7                  | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:02 PDT | 31 Jul 24 10:05 PDT |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-393000 --wait=true             | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:05 PDT |                     |
	|         | -v=7 --alsologtostderr               |           |         |         |                     |                     |
	|         | --driver=hyperkit                    |           |         |         |                     |                     |
	| node    | add -p ha-393000                     | ha-393000 | jenkins | v1.33.1 | 31 Jul 24 10:11 PDT | 31 Jul 24 10:12 PDT |
	|         | --control-plane -v=7                 |           |         |         |                     |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 10:05:02
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 10:05:02.368405    3827 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:05:02.368654    3827 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.368660    3827 out.go:304] Setting ErrFile to fd 2...
	I0731 10:05:02.368664    3827 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:05:02.368853    3827 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:05:02.370244    3827 out.go:298] Setting JSON to false
	I0731 10:05:02.392379    3827 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":2072,"bootTime":1722443430,"procs":453,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 10:05:02.392490    3827 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 10:05:02.414739    3827 out.go:177] * [ha-393000] minikube v1.33.1 on Darwin 14.5
	I0731 10:05:02.457388    3827 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 10:05:02.457417    3827 notify.go:220] Checking for updates...
	I0731 10:05:02.499271    3827 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:02.520330    3827 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 10:05:02.541352    3827 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 10:05:02.562183    3827 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 10:05:02.583467    3827 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 10:05:02.605150    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:02.605829    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.605892    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.615374    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51985
	I0731 10:05:02.615746    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.616162    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.616171    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.616434    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.616563    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.616815    3827 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 10:05:02.617053    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.617075    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.625506    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51987
	I0731 10:05:02.625873    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.626205    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.626218    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.626409    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.626526    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.655330    3827 out.go:177] * Using the hyperkit driver based on existing profile
	I0731 10:05:02.697472    3827 start.go:297] selected driver: hyperkit
	I0731 10:05:02.697517    3827 start.go:901] validating driver "hyperkit" against &{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclas
s:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersio
n:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:02.697705    3827 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 10:05:02.697830    3827 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:05:02.698011    3827 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 10:05:02.707355    3827 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 10:05:02.711327    3827 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.711347    3827 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 10:05:02.714056    3827 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:05:02.714115    3827 cni.go:84] Creating CNI manager for ""
	I0731 10:05:02.714124    3827 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:05:02.714208    3827 start.go:340] cluster config:
	{Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.16
9.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-t
iller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0
MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:02.714310    3827 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 10:05:02.756588    3827 out.go:177] * Starting "ha-393000" primary control-plane node in "ha-393000" cluster
	I0731 10:05:02.778505    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:02.778576    3827 preload.go:146] Found local preload: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 10:05:02.778606    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:05:02.778797    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:05:02.778816    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:05:02.779007    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:02.779936    3827 start.go:360] acquireMachinesLock for ha-393000: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:05:02.780056    3827 start.go:364] duration metric: took 96.562µs to acquireMachinesLock for "ha-393000"
	I0731 10:05:02.780090    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:05:02.780107    3827 fix.go:54] fixHost starting: 
	I0731 10:05:02.780518    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:02.780547    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:02.789537    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51989
	I0731 10:05:02.789941    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:02.790346    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:02.790360    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:02.790582    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:02.790683    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.790784    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:02.790882    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.790960    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3685
	I0731 10:05:02.791917    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid 3685 missing from process table
	I0731 10:05:02.791950    3827 fix.go:112] recreateIfNeeded on ha-393000: state=Stopped err=<nil>
	I0731 10:05:02.791969    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	W0731 10:05:02.792054    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:05:02.834448    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000" ...
	I0731 10:05:02.857592    3827 main.go:141] libmachine: (ha-393000) Calling .Start
	I0731 10:05:02.857865    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.857903    3827 main.go:141] libmachine: (ha-393000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid
	I0731 10:05:02.857999    3827 main.go:141] libmachine: (ha-393000) DBG | Using UUID 2cfe910a-b3bc-48dd-9b98-537ad9823a95
	I0731 10:05:02.972788    3827 main.go:141] libmachine: (ha-393000) DBG | Generated MAC 9e:7:8b:23:9c:e3
	I0731 10:05:02.972822    3827 main.go:141] libmachine: (ha-393000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:05:02.973002    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002e0840)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:02.973031    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"2cfe910a-b3bc-48dd-9b98-537ad9823a95", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002e0840)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:02.973095    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "2cfe910a-b3bc-48dd-9b98-537ad9823a95", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=s
erial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:05:02.973143    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 2cfe910a-b3bc-48dd-9b98-537ad9823a95 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/ha-393000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:05:02.973162    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:05:02.974700    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 DEBUG: hyperkit: Pid is 3840
	I0731 10:05:02.975089    3827 main.go:141] libmachine: (ha-393000) DBG | Attempt 0
	I0731 10:05:02.975104    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:02.975174    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:05:02.977183    3827 main.go:141] libmachine: (ha-393000) DBG | Searching for 9e:7:8b:23:9c:e3 in /var/db/dhcpd_leases ...
	I0731 10:05:02.977235    3827 main.go:141] libmachine: (ha-393000) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:05:02.977252    3827 main.go:141] libmachine: (ha-393000) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66aa6ebd}
	I0731 10:05:02.977264    3827 main.go:141] libmachine: (ha-393000) DBG | Found match: 9e:7:8b:23:9c:e3
	I0731 10:05:02.977271    3827 main.go:141] libmachine: (ha-393000) DBG | IP: 192.169.0.5
	I0731 10:05:02.977358    3827 main.go:141] libmachine: (ha-393000) Calling .GetConfigRaw
	I0731 10:05:02.978043    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:02.978221    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:02.978639    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:05:02.978649    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:02.978783    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:02.978867    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:02.978959    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:02.979081    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:02.979169    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:02.979279    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:02.979484    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:02.979495    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:05:02.982358    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:02 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:05:03.035630    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:05:03.036351    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:03.036364    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:03.036371    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:03.036377    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:03.417037    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:05:03.417051    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:05:03.531673    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:03.531715    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:03.531732    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:03.531747    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:03.532606    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:05:03.532629    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:03 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:05:09.110387    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:05:09.110442    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:05:09.110451    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:05:09.135557    3827 main.go:141] libmachine: (ha-393000) DBG | 2024/07/31 10:05:09 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:05:12.964386    3827 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.5:22: connect: connection refused
	I0731 10:05:16.034604    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:05:16.034620    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.034750    3827 buildroot.go:166] provisioning hostname "ha-393000"
	I0731 10:05:16.034759    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.034882    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.034984    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.035084    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.035183    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.035281    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.035421    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.035570    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.035579    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000 && echo "ha-393000" | sudo tee /etc/hostname
	I0731 10:05:16.113215    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000
	
	I0731 10:05:16.113236    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.113381    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.113518    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.113636    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.113755    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.113885    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.114075    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.114086    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:05:16.184090    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:05:16.184121    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:05:16.184150    3827 buildroot.go:174] setting up certificates
	I0731 10:05:16.184163    3827 provision.go:84] configureAuth start
	I0731 10:05:16.184170    3827 main.go:141] libmachine: (ha-393000) Calling .GetMachineName
	I0731 10:05:16.184309    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:16.184430    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.184520    3827 provision.go:143] copyHostCerts
	I0731 10:05:16.184558    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:16.184631    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:05:16.184638    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:16.184770    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:05:16.184969    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:16.185016    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:05:16.185020    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:16.185099    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:05:16.185248    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:16.185290    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:05:16.185295    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:16.185376    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:05:16.185533    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000 san=[127.0.0.1 192.169.0.5 ha-393000 localhost minikube]
	I0731 10:05:16.315363    3827 provision.go:177] copyRemoteCerts
	I0731 10:05:16.315421    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:05:16.315435    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.315558    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.315655    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.315746    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.315837    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:16.355172    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:05:16.355248    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:05:16.374013    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:05:16.374082    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0731 10:05:16.392556    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:05:16.392614    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:05:16.411702    3827 provision.go:87] duration metric: took 227.524882ms to configureAuth
	I0731 10:05:16.411715    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:05:16.411879    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:16.411893    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:16.412059    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.412155    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.412231    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.412316    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.412388    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.412496    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.412621    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.412628    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:05:16.477022    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:05:16.477033    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:05:16.477102    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:05:16.477118    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.477251    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.477356    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.477432    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.477517    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.477641    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.477778    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.477823    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:05:16.554633    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:05:16.554652    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:16.554788    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:16.554883    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.554976    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:16.555060    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:16.555183    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:16.555333    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:16.555346    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:05:18.220571    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:05:18.220585    3827 machine.go:97] duration metric: took 15.241941013s to provisionDockerMachine
	I0731 10:05:18.220598    3827 start.go:293] postStartSetup for "ha-393000" (driver="hyperkit")
	I0731 10:05:18.220606    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:05:18.220616    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.220842    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:05:18.220863    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.220962    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.221049    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.221130    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.221229    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.266644    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:05:18.270380    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:05:18.270395    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:05:18.270494    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:05:18.270687    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:05:18.270693    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:05:18.270912    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:05:18.279363    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:18.313374    3827 start.go:296] duration metric: took 92.765768ms for postStartSetup
	I0731 10:05:18.313403    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.313592    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:05:18.313611    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.313704    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.313791    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.313881    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.313968    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.352727    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:05:18.352783    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:05:18.406781    3827 fix.go:56] duration metric: took 15.626681307s for fixHost
	I0731 10:05:18.406809    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.406951    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.407051    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.407152    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.407242    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.407364    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:18.407503    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.5 22 <nil> <nil>}
	I0731 10:05:18.407510    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:05:18.475125    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445518.591979627
	
	I0731 10:05:18.475138    3827 fix.go:216] guest clock: 1722445518.591979627
	I0731 10:05:18.475144    3827 fix.go:229] Guest: 2024-07-31 10:05:18.591979627 -0700 PDT Remote: 2024-07-31 10:05:18.406799 -0700 PDT m=+16.073052664 (delta=185.180627ms)
	I0731 10:05:18.475163    3827 fix.go:200] guest clock delta is within tolerance: 185.180627ms
	I0731 10:05:18.475167    3827 start.go:83] releasing machines lock for "ha-393000", held for 15.69510158s
	I0731 10:05:18.475186    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.475358    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:18.475493    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.475894    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.476002    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:18.476070    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:05:18.476101    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.476134    3827 ssh_runner.go:195] Run: cat /version.json
	I0731 10:05:18.476146    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:18.476186    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.476210    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:18.476297    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.476335    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:18.476385    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.476425    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:18.476484    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.476507    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:18.560719    3827 ssh_runner.go:195] Run: systemctl --version
	I0731 10:05:18.565831    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0731 10:05:18.570081    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:05:18.570125    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:05:18.582480    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:05:18.582493    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:18.582597    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:18.598651    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:05:18.607729    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:05:18.616451    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:05:18.616493    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:05:18.625351    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:18.634238    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:05:18.643004    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:18.651930    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:05:18.660791    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:05:18.669545    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:05:18.678319    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:05:18.687162    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:05:18.695297    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:05:18.703279    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:18.796523    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:05:18.814363    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:18.814439    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:05:18.827366    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:18.839312    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:05:18.855005    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:18.866218    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:18.877621    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:05:18.902460    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:18.913828    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
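	The step above writes `/etc/crictl.yaml` so that CRI tooling talks to the cri-dockerd socket. A minimal sketch of the same printf-and-tee idiom, using a throwaway temp directory instead of `/etc` (the path is illustrative, and sudo is dropped for safety):

```shell
#!/bin/sh
# Sketch: write a crictl config pointing CRI tooling at the
# cri-dockerd socket. Uses a temp dir instead of /etc for safety.
set -eu
dir="$(mktemp -d)"
mkdir -p "$dir/etc"
printf '%s\n' 'runtime-endpoint: unix:///var/run/cri-dockerd.sock' \
  | tee "$dir/etc/crictl.yaml"
```

	`tee` both writes the file and echoes it to stdout, which is why the real command pipes through `sudo tee` rather than using a shell redirect: the redirect would run with the unprivileged shell's permissions, while `tee` runs under sudo.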
	I0731 10:05:18.928675    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:05:18.931574    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:05:18.939501    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:05:18.952896    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:05:19.047239    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:05:19.144409    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:05:19.144484    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:05:19.159518    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:19.256187    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:05:21.607075    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.350869373s)
	I0731 10:05:21.607140    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:05:21.618076    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:05:21.632059    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:21.642878    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:05:21.739846    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:05:21.840486    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:21.956403    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:05:21.971397    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:21.982152    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:22.074600    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:05:22.139737    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:05:22.139811    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:05:22.144307    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:05:22.144354    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:05:22.147388    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:05:22.177098    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:05:22.177167    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:22.195025    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:22.255648    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:05:22.255698    3827 main.go:141] libmachine: (ha-393000) Calling .GetIP
	I0731 10:05:22.256066    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:05:22.260342    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
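	The command above is minikube's hosts-file update idiom: strip any stale `host.minikube.internal` entry, append the fresh mapping, then copy the rebuilt file back into place. A sketch of the same pattern against a temp file rather than the real `/etc/hosts` (the addresses are taken from the log; the stale `192.169.0.9` entry is invented for the demo):

```shell
#!/bin/bash
# Sketch of the hosts-file refresh: remove any old host.minikube.internal
# line, append the current one, copy the result back over the file.
set -eu
hosts="$(mktemp)"
printf '127.0.0.1\tlocalhost\n192.169.0.9\thost.minikube.internal\n' > "$hosts"
{ grep -v $'\thost.minikube.internal$' "$hosts"; \
  printf '192.169.0.1\thost.minikube.internal\n'; } > "$hosts.new"
cp "$hosts.new" "$hosts"
cat "$hosts"
```

	Rebuilding into a temp file and copying it back (instead of editing in place) keeps the file whole even if the rewrite is interrupted midway.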
	I0731 10:05:22.270020    3827 kubeadm.go:883] updating cluster {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.
3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false f
reshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID
:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0731 10:05:22.270145    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:22.270198    3827 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:05:22.283427    3827 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:05:22.283451    3827 docker.go:615] Images already preloaded, skipping extraction
	I0731 10:05:22.283523    3827 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0731 10:05:22.296364    3827 docker.go:685] Got preloaded images: -- stdout --
	kindest/kindnetd:v20240719-e7903573
	registry.k8s.io/kube-apiserver:v1.30.3
	registry.k8s.io/kube-scheduler:v1.30.3
	registry.k8s.io/kube-controller-manager:v1.30.3
	registry.k8s.io/kube-proxy:v1.30.3
	ghcr.io/kube-vip/kube-vip:v0.8.0
	registry.k8s.io/etcd:3.5.12-0
	registry.k8s.io/coredns/coredns:v1.11.1
	registry.k8s.io/pause:3.9
	gcr.io/k8s-minikube/storage-provisioner:v5
	gcr.io/k8s-minikube/busybox:1.28
	
	-- /stdout --
	I0731 10:05:22.296384    3827 cache_images.go:84] Images are preloaded, skipping loading
	I0731 10:05:22.296395    3827 kubeadm.go:934] updating node { 192.169.0.5 8443 v1.30.3 docker true true} ...
	I0731 10:05:22.296485    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:05:22.296554    3827 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0731 10:05:22.333611    3827 cni.go:84] Creating CNI manager for ""
	I0731 10:05:22.333625    3827 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0731 10:05:22.333642    3827 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0731 10:05:22.333657    3827 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.169.0.5 APIServerPort:8443 KubernetesVersion:v1.30.3 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-393000 NodeName:ha-393000 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.169.0.5"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.169.0.5 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manif
ests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///var/run/cri-dockerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0731 10:05:22.333735    3827 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.169.0.5
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///var/run/cri-dockerd.sock
	  name: "ha-393000"
	  kubeletExtraArgs:
	    node-ip: 192.169.0.5
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.169.0.5"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.3
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///var/run/cri-dockerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0731 10:05:22.333754    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:05:22.333805    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:05:22.346453    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:05:22.346520    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:05:22.346575    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:05:22.354547    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:05:22.354585    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0731 10:05:22.361938    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (307 bytes)
	I0731 10:05:22.375252    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:05:22.388755    3827 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2148 bytes)
	I0731 10:05:22.402335    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:05:22.415747    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:05:22.418701    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:22.428772    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:22.517473    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:22.532209    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.5
	I0731 10:05:22.532222    3827 certs.go:194] generating shared ca certs ...
	I0731 10:05:22.532233    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:22.532416    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:05:22.532495    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:05:22.532505    3827 certs.go:256] generating profile certs ...
	I0731 10:05:22.532617    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:05:22.532703    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.b87e2e7e
	I0731 10:05:22.532784    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:05:22.532791    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:05:22.532813    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:05:22.532832    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:05:22.532850    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:05:22.532866    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:05:22.532896    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:05:22.532925    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:05:22.532949    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:05:22.533054    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:05:22.533101    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:05:22.533110    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:05:22.533142    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:05:22.533177    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:05:22.533206    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:05:22.533274    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:22.533306    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.533327    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.533344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.533765    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:05:22.562933    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:05:22.585645    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:05:22.608214    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:05:22.634417    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:05:22.664309    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:05:22.693214    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:05:22.749172    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:05:22.798119    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:05:22.837848    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:05:22.862351    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:05:22.887141    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0731 10:05:22.900789    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:05:22.904988    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:05:22.914154    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.917542    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.917577    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:05:22.921712    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:05:22.930986    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:05:22.940208    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.943536    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.943573    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:22.947845    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:05:22.957024    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:05:22.965988    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.969319    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.969351    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:05:22.973794    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
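	The `openssl x509 -hash` runs above, paired with the `ln -fs ... /etc/ssl/certs/<hash>.0` links, are the standard c_rehash technique: OpenSSL looks up CA certificates in a directory by subject-name hash, so each cert gets a `<hash>.0` symlink. A self-contained sketch using a throwaway self-signed cert (the `example-ca` subject is invented) in a temp directory:

```shell
#!/bin/bash
# Sketch of the subject-hash symlink step: OpenSSL finds CA certs in a
# hashed directory via <subject-hash>.0 links pointing at the cert file.
set -eu
dir="$(mktemp -d)"
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj '/CN=example-ca' \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" 2>/dev/null
hash="$(openssl x509 -hash -noout -in "$dir/ca.pem")"
ln -fs "$dir/ca.pem" "$dir/$hash.0"
ls -l "$dir/$hash.0"
```

	The `test -L ... || ln -fs` guard in the log just makes the step idempotent across restarts: the link is only (re)created when it is missing.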
	I0731 10:05:22.982944    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:05:22.986290    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:05:22.990544    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:05:22.994707    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:05:22.999035    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:05:23.003364    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:05:23.007486    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
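	The six `-checkend 86400` probes above are cheap expiry checks: `openssl x509 -checkend N` exits 0 only if the certificate is still valid N seconds (here, one day) from now, which is how minikube decides whether the existing control-plane certs can be reused. A sketch with a freshly generated two-day cert (subject `/CN=probe` is invented):

```shell
#!/bin/bash
# Sketch of the expiry probe: -checkend 86400 succeeds only if the cert
# will still be valid 24 hours from now.
set -eu
dir="$(mktemp -d)"
openssl req -x509 -newkey rsa:2048 -nodes -days 2 \
  -subj '/CN=probe' -keyout "$dir/k.pem" -out "$dir/c.pem" 2>/dev/null
if openssl x509 -noout -in "$dir/c.pem" -checkend 86400; then
  echo "cert valid for at least one more day"
fi
```

	On a cert expiring within the window, `-checkend` exits nonzero instead, and minikube would regenerate it.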
	I0731 10:05:23.011657    3827 kubeadm.go:392] StartCluster: {Name:ha-393000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:2200 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 C
lusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true} {Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false fres
hpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:do
cker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 10:05:23.011769    3827 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0731 10:05:23.024287    3827 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0731 10:05:23.032627    3827 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0731 10:05:23.032639    3827 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0731 10:05:23.032681    3827 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0731 10:05:23.040731    3827 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:05:23.041056    3827 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-393000" does not appear in /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.041141    3827 kubeconfig.go:62] /Users/jenkins/minikube-integration/19349-1046/kubeconfig needs updating (will repair): [kubeconfig missing "ha-393000" cluster setting kubeconfig missing "ha-393000" context setting]
	I0731 10:05:23.041332    3827 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.041968    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.042168    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.5:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0731 10:05:23.042482    3827 cert_rotation.go:137] Starting client certificate rotation controller
	I0731 10:05:23.042638    3827 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0731 10:05:23.050561    3827 kubeadm.go:630] The running cluster does not require reconfiguration: 192.169.0.5
	I0731 10:05:23.050575    3827 kubeadm.go:597] duration metric: took 17.931942ms to restartPrimaryControlPlane
	I0731 10:05:23.050580    3827 kubeadm.go:394] duration metric: took 38.928464ms to StartCluster
	I0731 10:05:23.050588    3827 settings.go:142] acquiring lock: {Name:mk525bf71ea24469d1f77eee139c72001194c08b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.050661    3827 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:23.051035    3827 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/19349-1046/kubeconfig: {Name:mk492ed063b37397828d5838cb4d258d4e388ff3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:23.051268    3827 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.169.0.5 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:05:23.051280    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:05:23.051290    3827 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0731 10:05:23.051393    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:23.095938    3827 out.go:177] * Enabled addons: 
	I0731 10:05:23.116914    3827 addons.go:510] duration metric: took 65.60253ms for enable addons: enabled=[]
	I0731 10:05:23.116954    3827 start.go:246] waiting for cluster config update ...
	I0731 10:05:23.116965    3827 start.go:255] writing updated cluster config ...
	I0731 10:05:23.138605    3827 out.go:177] 
	I0731 10:05:23.160466    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:23.160597    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.182983    3827 out.go:177] * Starting "ha-393000-m02" control-plane node in "ha-393000" cluster
	I0731 10:05:23.224869    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:05:23.224904    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:05:23.225104    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:05:23.225125    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:05:23.225250    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.226256    3827 start.go:360] acquireMachinesLock for ha-393000-m02: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:05:23.226360    3827 start.go:364] duration metric: took 80.549µs to acquireMachinesLock for "ha-393000-m02"
	I0731 10:05:23.226385    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:05:23.226394    3827 fix.go:54] fixHost starting: m02
	I0731 10:05:23.226804    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:23.226838    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:23.236394    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52012
	I0731 10:05:23.236756    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:23.237106    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:23.237125    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:23.237342    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:23.237473    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:23.237574    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetState
	I0731 10:05:23.237669    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.237738    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3703
	I0731 10:05:23.238671    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:23.238732    3827 fix.go:112] recreateIfNeeded on ha-393000-m02: state=Stopped err=<nil>
	I0731 10:05:23.238750    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	W0731 10:05:23.238834    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:05:23.260015    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m02" ...
	I0731 10:05:23.302032    3827 main.go:141] libmachine: (ha-393000-m02) Calling .Start
	I0731 10:05:23.302368    3827 main.go:141] libmachine: (ha-393000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid
	I0731 10:05:23.302393    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.304220    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid 3703 missing from process table
	I0731 10:05:23.304235    3827 main.go:141] libmachine: (ha-393000-m02) DBG | pid 3703 is in state "Stopped"
	I0731 10:05:23.304257    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid...
	I0731 10:05:23.304590    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Using UUID 78632ed2-a86c-443c-8e8d-bbd47bc06547
	I0731 10:05:23.331752    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Generated MAC d6:c5:55:d7:1e:6a
	I0731 10:05:23.331774    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:05:23.331901    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c2fc0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:23.331928    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"78632ed2-a86c-443c-8e8d-bbd47bc06547", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003c2fc0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:05:23.331992    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "78632ed2-a86c-443c-8e8d-bbd47bc06547", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:05:23.332030    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 78632ed2-a86c-443c-8e8d-bbd47bc06547 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/ha-393000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:05:23.332051    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:05:23.333566    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 DEBUG: hyperkit: Pid is 3849
	I0731 10:05:23.333951    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Attempt 0
	I0731 10:05:23.333966    3827 main.go:141] libmachine: (ha-393000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:23.334032    3827 main.go:141] libmachine: (ha-393000-m02) DBG | hyperkit pid from json: 3849
	I0731 10:05:23.335680    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Searching for d6:c5:55:d7:1e:6a in /var/db/dhcpd_leases ...
	I0731 10:05:23.335745    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:05:23.335779    3827 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:05:23.335790    3827 main.go:141] libmachine: (ha-393000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abbf52}
	I0731 10:05:23.335796    3827 main.go:141] libmachine: (ha-393000-m02) DBG | Found match: d6:c5:55:d7:1e:6a
	I0731 10:05:23.335803    3827 main.go:141] libmachine: (ha-393000-m02) DBG | IP: 192.169.0.6
	I0731 10:05:23.335842    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetConfigRaw
	I0731 10:05:23.336526    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:23.336703    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:05:23.337199    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:05:23.337210    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:23.337324    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:23.337431    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:23.337536    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:23.337643    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:23.337761    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:23.337898    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:23.338051    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:23.338058    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:05:23.341501    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:05:23.350236    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:05:23.351301    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:23.351321    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:23.351333    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:23.351364    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:23.736116    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:05:23.736132    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:05:23.851173    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:05:23.851191    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:05:23.851204    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:05:23.851217    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:05:23.852083    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:05:23.852399    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:23 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:05:29.408102    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:05:29.408171    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:05:29.408180    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:05:29.431671    3827 main.go:141] libmachine: (ha-393000-m02) DBG | 2024/07/31 10:05:29 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:05:34.400446    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:05:34.400461    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.400584    3827 buildroot.go:166] provisioning hostname "ha-393000-m02"
	I0731 10:05:34.400595    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.400705    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.400796    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.400890    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.400963    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.401039    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.401181    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.401327    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.401336    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m02 && echo "ha-393000-m02" | sudo tee /etc/hostname
	I0731 10:05:34.470038    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m02
	
	I0731 10:05:34.470053    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.470199    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.470327    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.470407    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.470489    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.470615    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.470762    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.470773    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:05:34.535872    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:05:34.535890    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:05:34.535899    3827 buildroot.go:174] setting up certificates
	I0731 10:05:34.535905    3827 provision.go:84] configureAuth start
	I0731 10:05:34.535911    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetMachineName
	I0731 10:05:34.536042    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:34.536141    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.536239    3827 provision.go:143] copyHostCerts
	I0731 10:05:34.536274    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:34.536323    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:05:34.536328    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:05:34.536441    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:05:34.536669    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:34.536701    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:05:34.536706    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:05:34.536812    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:05:34.536958    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:34.536987    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:05:34.536992    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:05:34.537061    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:05:34.537222    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m02 san=[127.0.0.1 192.169.0.6 ha-393000-m02 localhost minikube]
	I0731 10:05:34.648982    3827 provision.go:177] copyRemoteCerts
	I0731 10:05:34.649040    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:05:34.649057    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.649198    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.649295    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.649402    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.649489    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:34.683701    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:05:34.683772    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:05:34.703525    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:05:34.703596    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:05:34.722548    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:05:34.722624    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:05:34.742309    3827 provision.go:87] duration metric: took 206.391288ms to configureAuth
	I0731 10:05:34.742322    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:05:34.742483    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:34.742496    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:34.742630    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.742723    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.742814    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.742903    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.742982    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.743099    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.743260    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.743269    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:05:34.800092    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:05:34.800106    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:05:34.800191    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:05:34.800203    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.800330    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.800415    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.800506    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.800591    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.800702    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.800838    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.800885    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:05:34.869190    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this option.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:05:34.869210    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:34.869342    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:34.869439    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.869544    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:34.869626    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:34.869780    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:34.869920    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:34.869935    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:05:36.520454    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:05:36.520469    3827 machine.go:97] duration metric: took 13.183263325s to provisionDockerMachine
	I0731 10:05:36.520479    3827 start.go:293] postStartSetup for "ha-393000-m02" (driver="hyperkit")
	I0731 10:05:36.520499    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:05:36.520508    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.520691    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:05:36.520702    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.520789    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.520884    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.520979    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.521066    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.561300    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:05:36.564926    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:05:36.564938    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:05:36.565027    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:05:36.565170    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:05:36.565176    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:05:36.565342    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:05:36.574123    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:36.603284    3827 start.go:296] duration metric: took 82.788869ms for postStartSetup
	I0731 10:05:36.603307    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.603494    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:05:36.603509    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.603613    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.603706    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.603803    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.603903    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.639240    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:05:36.639297    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:05:36.692559    3827 fix.go:56] duration metric: took 13.466165097s for fixHost
	I0731 10:05:36.692585    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.692728    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.692817    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.692901    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.692991    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.693111    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:05:36.693255    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.6 22 <nil> <nil>}
	I0731 10:05:36.693263    3827 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0731 10:05:36.752606    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445536.868457526
	
	I0731 10:05:36.752619    3827 fix.go:216] guest clock: 1722445536.868457526
	I0731 10:05:36.752626    3827 fix.go:229] Guest: 2024-07-31 10:05:36.868457526 -0700 PDT Remote: 2024-07-31 10:05:36.692574 -0700 PDT m=+34.358830009 (delta=175.883526ms)
	I0731 10:05:36.752636    3827 fix.go:200] guest clock delta is within tolerance: 175.883526ms
	I0731 10:05:36.752640    3827 start.go:83] releasing machines lock for "ha-393000-m02", held for 13.526270601s
	I0731 10:05:36.752657    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.752793    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:36.777379    3827 out.go:177] * Found network options:
	I0731 10:05:36.798039    3827 out.go:177]   - NO_PROXY=192.169.0.5
	W0731 10:05:36.819503    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:05:36.819540    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820385    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820643    3827 main.go:141] libmachine: (ha-393000-m02) Calling .DriverName
	I0731 10:05:36.820770    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:05:36.820818    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	W0731 10:05:36.820878    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:05:36.820996    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:05:36.821009    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.821024    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHHostname
	I0731 10:05:36.821247    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.821250    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHPort
	I0731 10:05:36.821474    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.821525    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHKeyPath
	I0731 10:05:36.821664    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	I0731 10:05:36.821739    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetSSHUsername
	I0731 10:05:36.821918    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m02/id_rsa Username:docker}
	W0731 10:05:36.854335    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:05:36.854406    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:05:36.901302    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:05:36.901324    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:36.901422    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:36.917770    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:05:36.926621    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:05:36.935218    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:05:36.935259    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:05:36.943879    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:36.952873    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:05:36.961710    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:05:36.970281    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:05:36.979176    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:05:36.987922    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:05:36.996548    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:05:37.005349    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:05:37.013281    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:05:37.020977    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:37.118458    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:05:37.137862    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:05:37.137937    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:05:37.153588    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:37.167668    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:05:37.181903    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:05:37.192106    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:37.202268    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:05:37.223314    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:05:37.233629    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:05:37.248658    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:05:37.251547    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:05:37.258758    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:05:37.272146    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:05:37.371218    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:05:37.472623    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:05:37.472648    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:05:37.486639    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:37.587113    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:05:39.947283    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.360151257s)
	I0731 10:05:39.947347    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:05:39.958391    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:05:39.972060    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:39.983040    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:05:40.085475    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:05:40.202062    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:40.302654    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:05:40.316209    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:05:40.326252    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:40.418074    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:05:40.482758    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:05:40.482836    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:05:40.487561    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:05:40.487613    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:05:40.491035    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:05:40.518347    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:05:40.518420    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:40.537051    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:05:40.576384    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:05:40.597853    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:05:40.618716    3827 main.go:141] libmachine: (ha-393000-m02) Calling .GetIP
	I0731 10:05:40.618993    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:05:40.622501    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:40.631917    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:05:40.632085    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:40.632302    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:40.632324    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:40.640887    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52034
	I0731 10:05:40.641227    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:40.641546    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:40.641557    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:40.641784    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:40.641900    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:05:40.641993    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:05:40.642069    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:05:40.643035    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:05:40.643318    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:05:40.643340    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:05:40.651868    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52036
	I0731 10:05:40.652209    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:05:40.652562    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:05:40.652581    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:05:40.652781    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:05:40.652890    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:05:40.652982    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.6
	I0731 10:05:40.652988    3827 certs.go:194] generating shared ca certs ...
	I0731 10:05:40.653003    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:05:40.653135    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:05:40.653190    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:05:40.653199    3827 certs.go:256] generating profile certs ...
	I0731 10:05:40.653301    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:05:40.653388    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.59c17652
	I0731 10:05:40.653436    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:05:40.653443    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:05:40.653468    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:05:40.653489    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:05:40.653510    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:05:40.653529    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:05:40.653548    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:05:40.653566    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:05:40.653584    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:05:40.653667    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:05:40.653713    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:05:40.653722    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:05:40.653755    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:05:40.653790    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:05:40.653819    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:05:40.653897    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:05:40.653931    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:05:40.653957    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:40.653976    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:05:40.654001    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:05:40.654103    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:05:40.654205    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:05:40.654295    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:05:40.654382    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:05:40.686134    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0731 10:05:40.689771    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 10:05:40.697866    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0731 10:05:40.700957    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 10:05:40.708798    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 10:05:40.711973    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 10:05:40.719794    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0731 10:05:40.722937    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 10:05:40.731558    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0731 10:05:40.734708    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 10:05:40.742535    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0731 10:05:40.745692    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 10:05:40.753969    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:05:40.774721    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:05:40.793621    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:05:40.813481    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:05:40.833191    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:05:40.853099    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:05:40.872942    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:05:40.892952    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:05:40.912690    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:05:40.932438    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:05:40.952459    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:05:40.971059    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 10:05:40.984708    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 10:05:40.998235    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 10:05:41.011745    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 10:05:41.025144    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 10:05:41.038794    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 10:05:41.052449    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 10:05:41.066415    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:05:41.070679    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:05:41.078894    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.082206    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.082237    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:05:41.086362    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:05:41.094634    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:05:41.103040    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.106511    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.106559    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:05:41.110939    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:05:41.119202    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:05:41.127421    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.130734    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.130783    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:05:41.134845    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:05:41.142958    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:05:41.146291    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:05:41.150662    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:05:41.154843    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:05:41.159061    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:05:41.163240    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:05:41.167541    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:05:41.171729    3827 kubeadm.go:934] updating node {m02 192.169.0.6 8443 v1.30.3 docker true true} ...
	I0731 10:05:41.171784    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.6
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:05:41.171806    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:05:41.171838    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:05:41.184093    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:05:41.184125    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:05:41.184181    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:05:41.191780    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:05:41.191825    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 10:05:41.199155    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:05:41.212419    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:05:41.225964    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:05:41.239859    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:05:41.242661    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:05:41.251855    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:41.345266    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:41.360525    3827 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.169.0.6 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:05:41.360751    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:05:41.382214    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:05:41.402932    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:05:41.525126    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:05:41.539502    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:05:41.539699    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:05:41.539742    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:05:41.539934    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m02" to be "Ready" ...
	I0731 10:05:41.540009    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:41.540015    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:41.540022    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:41.540026    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.017427    3827 round_trippers.go:574] Response Status: 200 OK in 8477 milliseconds
	I0731 10:05:50.018648    3827 node_ready.go:49] node "ha-393000-m02" has status "Ready":"True"
	I0731 10:05:50.018662    3827 node_ready.go:38] duration metric: took 8.478709659s for node "ha-393000-m02" to be "Ready" ...
	I0731 10:05:50.018668    3827 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:05:50.018717    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:05:50.018723    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.018731    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.018737    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.028704    3827 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0731 10:05:50.043501    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.043562    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:05:50.043568    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.043574    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.043579    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.049258    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.050015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.050025    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.050031    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.050035    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.066794    3827 round_trippers.go:574] Response Status: 200 OK in 16 milliseconds
	I0731 10:05:50.067093    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.067103    3827 pod_ready.go:81] duration metric: took 23.584491ms for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.067110    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.067150    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 10:05:50.067155    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.067161    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.067170    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.072229    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.072653    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.072662    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.072674    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.072678    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.076158    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:50.076475    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.076487    3827 pod_ready.go:81] duration metric: took 9.372147ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.076494    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.076536    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 10:05:50.076541    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.076547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.076551    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.079467    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.079849    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.079858    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.079866    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.079871    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.086323    3827 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 10:05:50.086764    3827 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.086775    3827 pod_ready.go:81] duration metric: took 10.276448ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.086782    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.086839    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 10:05:50.086846    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.086852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.086861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.090747    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:50.091293    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:50.091301    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.091306    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.091310    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.093538    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.094155    3827 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.094165    3827 pod_ready.go:81] duration metric: took 7.376399ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.094171    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.094209    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 10:05:50.094214    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.094220    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.094223    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.096892    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.219826    3827 request.go:629] Waited for 122.388601ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:05:50.219867    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:05:50.219876    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.219882    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.219887    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.222303    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.222701    3827 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:05:50.222710    3827 pod_ready.go:81] duration metric: took 128.533092ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.222720    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.419341    3827 request.go:629] Waited for 196.517978ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:05:50.419372    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:05:50.419376    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.419382    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.419386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.424561    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:05:50.619242    3827 request.go:629] Waited for 194.143472ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.619333    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:05:50.619339    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.619346    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.619350    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.622245    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:50.622550    3827 pod_ready.go:97] node "ha-393000" hosting pod "kube-apiserver-ha-393000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-393000" has status "Ready":"False"
	I0731 10:05:50.622563    3827 pod_ready.go:81] duration metric: took 399.836525ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	E0731 10:05:50.622570    3827 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-393000" hosting pod "kube-apiserver-ha-393000" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-393000" has status "Ready":"False"
	I0731 10:05:50.622575    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:05:50.819353    3827 request.go:629] Waited for 196.739442ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:50.819427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:50.819433    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:50.819438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:50.819447    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:50.822809    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:51.019387    3827 request.go:629] Waited for 196.0195ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.019427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.019480    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.019488    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.019494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.021643    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.220184    3827 request.go:629] Waited for 96.247837ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.220254    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.220260    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.220266    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.220271    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.222468    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.419702    3827 request.go:629] Waited for 196.732028ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.419735    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.419739    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.419746    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.419749    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.422018    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.622851    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:51.622865    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.622870    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.622873    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.625570    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:51.818923    3827 request.go:629] Waited for 192.647007ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.818965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:51.818971    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:51.818977    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:51.818981    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:51.821253    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.123108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:52.123124    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.123133    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.123137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.125336    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.220188    3827 request.go:629] Waited for 94.282602ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.220295    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.220306    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.220317    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.220325    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.223136    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:52.623123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:52.623202    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.623217    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.623227    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.626259    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:52.626893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:52.626903    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:52.626912    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:52.626916    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:52.628416    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:52.628799    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:53.124413    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:53.124432    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.124441    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.124446    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.127045    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:53.127494    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:53.127501    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.127511    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.127514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.129223    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:53.623065    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:53.623121    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.623133    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.623142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.626047    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:53.626707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:53.626717    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:53.626725    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:53.626729    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:53.628447    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:54.123646    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:54.123761    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.123778    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.123788    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.127286    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:54.128015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:54.128025    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.128033    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.128038    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.130101    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:54.623229    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:54.623244    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.623253    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.623266    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.625325    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:54.625780    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:54.625788    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:54.625794    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:54.625798    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:54.627218    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:55.123298    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:55.123318    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.123329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.123334    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.126495    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:55.127199    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:55.127207    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.127213    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.127217    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.128585    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:55.128968    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:55.623994    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:55.624008    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.624016    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.624021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.626813    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:55.627329    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:55.627336    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:55.627342    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:55.627345    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:55.628805    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:56.123118    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:56.123195    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.123210    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.123231    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.126276    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:56.126864    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:56.126872    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.126877    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.126881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.128479    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:56.623814    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:56.623924    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.623942    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.623953    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.626841    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:56.627450    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:56.627457    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:56.627463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:56.627467    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:56.628844    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:57.124173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:57.124250    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.124262    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.124287    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.127734    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:57.128370    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:57.128377    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.128383    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.128386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.130108    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:57.130481    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:05:57.624004    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:57.624033    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.624093    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.624103    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.627095    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:57.628522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:57.628533    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:57.628541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:57.628547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:57.630446    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.123493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:58.123505    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.123512    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.123514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.125506    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.126108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:58.126116    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.126121    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.126124    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.127991    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:58.623114    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:58.623141    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.623216    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.623228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.626428    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:58.627173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:58.627181    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:58.627187    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:58.627191    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:58.628749    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.123212    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:59.123231    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.123243    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.123249    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.126584    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:05:59.127100    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:59.127110    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.127118    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.127123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.129080    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.624707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:05:59.624736    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.624808    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.624814    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.627710    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:05:59.628543    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:05:59.628550    3827 round_trippers.go:469] Request Headers:
	I0731 10:05:59.628556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:05:59.628560    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:05:59.630077    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:05:59.630437    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:00.123863    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:00.123878    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.123885    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.123888    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.125761    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.126237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:00.126245    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.126251    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.126254    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.127937    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.623226    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:00.623240    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.623246    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.623249    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.625210    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:00.625691    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:00.625699    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:00.625704    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:00.625708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:00.627280    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:01.124705    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:01.124804    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.124820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.124830    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.127445    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:01.127933    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:01.127941    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.127947    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.127950    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.129462    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:01.623718    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:01.623731    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.623736    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.623739    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.625948    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:01.626336    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:01.626344    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:01.626349    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:01.626352    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:01.627901    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.124021    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:02.124081    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.124088    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.124092    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.125801    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.126187    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:02.126195    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.126200    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.126204    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.127656    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:02.127974    3827 pod_ready.go:102] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:02.623206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:02.623222    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.623228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.623232    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.626774    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:02.627381    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:02.627389    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:02.627395    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:02.627400    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:02.630037    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:03.122889    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:03.122980    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.122991    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.122997    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.125539    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:03.125964    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:03.125972    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.125976    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.125991    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.129847    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:03.623340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:03.623368    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.623379    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.623386    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.626892    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:03.627517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:03.627524    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:03.627530    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:03.627532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:03.629281    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.123967    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:04.124007    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.124016    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.124021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.126604    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.127104    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.127111    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.127116    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.127131    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.128806    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.129260    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.129268    3827 pod_ready.go:81] duration metric: took 13.506690115s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.129277    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.129312    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:04.129317    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.129323    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.129328    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.131506    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.131966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.131974    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.131980    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.131984    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.133464    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.133963    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.133974    3827 pod_ready.go:81] duration metric: took 4.690553ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.133981    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.134013    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:04.134018    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.134023    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.134028    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.136093    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.136498    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:04.136506    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.136512    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.136515    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.138480    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.138864    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.138874    3827 pod_ready.go:81] duration metric: took 4.887644ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.138882    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.138917    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:04.138922    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.138928    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.138932    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.140760    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.141121    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.141129    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.141134    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.141137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.143127    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.143455    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.143464    3827 pod_ready.go:81] duration metric: took 4.577275ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.143471    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.143508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:04.143513    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.143519    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.143523    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.145638    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.145987    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.145994    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.146000    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.146003    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.147718    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.148046    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.148055    3827 pod_ready.go:81] duration metric: took 4.578507ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.148061    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.325414    3827 request.go:629] Waited for 177.298505ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:04.325532    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:04.325544    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.325555    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.325563    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.328825    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:04.525753    3827 request.go:629] Waited for 196.338568ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.525806    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:04.525817    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.525828    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.525836    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.529114    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:04.529604    3827 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.529616    3827 pod_ready.go:81] duration metric: took 381.550005ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.529625    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.724886    3827 request.go:629] Waited for 195.165832ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:04.724925    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:04.724931    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.724937    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.724942    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.726934    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:04.924942    3827 request.go:629] Waited for 197.623557ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.924972    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:04.924977    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:04.924984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:04.924987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:04.927056    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:04.927556    3827 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:04.927566    3827 pod_ready.go:81] duration metric: took 397.934888ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:04.927572    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.124719    3827 request.go:629] Waited for 197.081968ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:05.124759    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:05.124767    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.124774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.124777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.126705    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:05.324036    3827 request.go:629] Waited for 196.854241ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.324127    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.324136    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.324144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.324151    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.326450    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:05.326831    3827 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:05.326840    3827 pod_ready.go:81] duration metric: took 399.263993ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.326854    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.525444    3827 request.go:629] Waited for 198.543186ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:05.525479    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:05.525484    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.525490    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.525494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.527459    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:05.724382    3827 request.go:629] Waited for 196.465154ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.724493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:05.724505    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.724516    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.724528    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.727650    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:05.728134    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:05.728147    3827 pod_ready.go:81] duration metric: took 401.285988ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.728155    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:05.925067    3827 request.go:629] Waited for 196.808438ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:05.925117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:05.925127    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:05.925137    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:05.925147    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:05.928198    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.125772    3827 request.go:629] Waited for 196.79397ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:06.125895    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:06.125907    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.125918    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.125924    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.129114    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.129535    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:06.129548    3827 pod_ready.go:81] duration metric: took 401.386083ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.129557    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.324601    3827 request.go:629] Waited for 194.995432ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:06.324707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:06.324718    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.324729    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.324736    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.327699    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:06.524056    3827 request.go:629] Waited for 195.918056ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:06.524164    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:06.524175    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.524186    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.524192    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.527800    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:06.528245    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:06.528255    3827 pod_ready.go:81] duration metric: took 398.692914ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:06.528262    3827 pod_ready.go:38] duration metric: took 16.509588377s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:06.528282    3827 api_server.go:52] waiting for apiserver process to appear ...
	I0731 10:06:06.528341    3827 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:06:06.541572    3827 api_server.go:72] duration metric: took 25.181024878s to wait for apiserver process to appear ...
	I0731 10:06:06.541584    3827 api_server.go:88] waiting for apiserver healthz status ...
	I0731 10:06:06.541605    3827 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 10:06:06.544968    3827 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 10:06:06.545011    3827 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 10:06:06.545016    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.545023    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.545027    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.545730    3827 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 10:06:06.545799    3827 api_server.go:141] control plane version: v1.30.3
	I0731 10:06:06.545808    3827 api_server.go:131] duration metric: took 4.219553ms to wait for apiserver health ...
	I0731 10:06:06.545813    3827 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 10:06:06.724899    3827 request.go:629] Waited for 179.053526ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:06.724936    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:06.724942    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.724948    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.724951    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.733411    3827 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0731 10:06:06.742910    3827 system_pods.go:59] 24 kube-system pods found
	I0731 10:06:06.742937    3827 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:06.742945    3827 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:06.742950    3827 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:06.742953    3827 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:06.742958    3827 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:06.742961    3827 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:06.742963    3827 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:06.742966    3827 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:06.742968    3827 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:06.742971    3827 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:06.742973    3827 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:06.742977    3827 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:06.742981    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:06.742984    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:06.742986    3827 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:06.742989    3827 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:06.742991    3827 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:06.742995    3827 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:06.742998    3827 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:06.743001    3827 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:06.743003    3827 system_pods.go:61] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Pending
	I0731 10:06:06.743006    3827 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:06.743010    3827 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:06.743012    3827 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:06.743017    3827 system_pods.go:74] duration metric: took 197.200154ms to wait for pod list to return data ...
	I0731 10:06:06.743023    3827 default_sa.go:34] waiting for default service account to be created ...
	I0731 10:06:06.925020    3827 request.go:629] Waited for 181.949734ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:06.925060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:06.925067    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:06.925076    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:06.925081    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:06.927535    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:06.927730    3827 default_sa.go:45] found service account: "default"
	I0731 10:06:06.927740    3827 default_sa.go:55] duration metric: took 184.712762ms for default service account to be created ...
	I0731 10:06:06.927745    3827 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 10:06:07.125051    3827 request.go:629] Waited for 197.272072ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:07.125082    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:07.125090    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:07.125100    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:07.125104    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:07.129975    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:07.134630    3827 system_pods.go:86] 24 kube-system pods found
	I0731 10:06:07.134648    3827 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:07.134654    3827 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running / Ready:ContainersNotReady (containers with unready status: [coredns]) / ContainersReady:ContainersNotReady (containers with unready status: [coredns])
	I0731 10:06:07.134659    3827 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:07.134663    3827 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:07.134666    3827 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:07.134671    3827 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running / Ready:ContainersNotReady (containers with unready status: [kindnet-cni]) / ContainersReady:ContainersNotReady (containers with unready status: [kindnet-cni])
	I0731 10:06:07.134675    3827 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:07.134679    3827 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:07.134683    3827 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:07.134705    3827 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:07.134712    3827 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:07.134718    3827 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0731 10:06:07.134723    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:07.134728    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:07.134731    3827 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:07.134735    3827 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:07.134739    3827 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running / Ready:ContainersNotReady (containers with unready status: [kube-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-proxy])
	I0731 10:06:07.134743    3827 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:07.134747    3827 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:07.134751    3827 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:07.134755    3827 system_pods.go:89] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:07.134764    3827 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:07.134768    3827 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:07.134772    3827 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0731 10:06:07.134781    3827 system_pods.go:126] duration metric: took 207.030567ms to wait for k8s-apps to be running ...
	I0731 10:06:07.134786    3827 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 10:06:07.134841    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:06:07.148198    3827 system_svc.go:56] duration metric: took 13.406485ms WaitForService to wait for kubelet
	I0731 10:06:07.148215    3827 kubeadm.go:582] duration metric: took 25.78766951s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:06:07.148230    3827 node_conditions.go:102] verifying NodePressure condition ...
	I0731 10:06:07.324197    3827 request.go:629] Waited for 175.905806ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:07.324227    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:07.324232    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:07.324238    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:07.324243    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:07.329946    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:07.330815    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330830    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330840    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330843    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330847    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:07.330850    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:07.330853    3827 node_conditions.go:105] duration metric: took 182.619551ms to run NodePressure ...
	I0731 10:06:07.330860    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:06:07.330878    3827 start.go:255] writing updated cluster config ...
	I0731 10:06:07.352309    3827 out.go:177] 
	I0731 10:06:07.373528    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:07.373631    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.433500    3827 out.go:177] * Starting "ha-393000-m03" control-plane node in "ha-393000" cluster
	I0731 10:06:07.475236    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:06:07.475262    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:06:07.475398    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:06:07.475412    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:06:07.475498    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.476024    3827 start.go:360] acquireMachinesLock for ha-393000-m03: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:06:07.476077    3827 start.go:364] duration metric: took 40.57µs to acquireMachinesLock for "ha-393000-m03"
	I0731 10:06:07.476090    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:06:07.476095    3827 fix.go:54] fixHost starting: m03
	I0731 10:06:07.476337    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:07.476357    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:07.485700    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52041
	I0731 10:06:07.486069    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:07.486427    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:07.486449    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:07.486677    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:07.486797    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:07.486888    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetState
	I0731 10:06:07.486969    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.487057    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 2994
	I0731 10:06:07.488010    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:06:07.488031    3827 fix.go:112] recreateIfNeeded on ha-393000-m03: state=Stopped err=<nil>
	I0731 10:06:07.488039    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	W0731 10:06:07.488129    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:06:07.525270    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m03" ...
	I0731 10:06:07.583189    3827 main.go:141] libmachine: (ha-393000-m03) Calling .Start
	I0731 10:06:07.583357    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.583398    3827 main.go:141] libmachine: (ha-393000-m03) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid
	I0731 10:06:07.584444    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid 2994 missing from process table
	I0731 10:06:07.584457    3827 main.go:141] libmachine: (ha-393000-m03) DBG | pid 2994 is in state "Stopped"
	I0731 10:06:07.584473    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Removing stale pid file /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid...
	I0731 10:06:07.584622    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Using UUID 451d6bef-97a7-42a6-8ccb-b8851dda0594
	I0731 10:06:07.614491    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Generated MAC 3e:56:a2:18:e2:4c
	I0731 10:06:07.614519    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:06:07.614662    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003b11a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:07.614709    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"451d6bef-97a7-42a6-8ccb-b8851dda0594", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003b11a0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:07.614792    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "451d6bef-97a7-42a6-8ccb-b8851dda0594", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:06:07.614841    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 451d6bef-97a7-42a6-8ccb-b8851dda0594 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/ha-393000-m03.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:06:07.614865    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:06:07.616508    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 DEBUG: hyperkit: Pid is 3858
	I0731 10:06:07.617000    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Attempt 0
	I0731 10:06:07.617017    3827 main.go:141] libmachine: (ha-393000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:07.617185    3827 main.go:141] libmachine: (ha-393000-m03) DBG | hyperkit pid from json: 3858
	I0731 10:06:07.619558    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Searching for 3e:56:a2:18:e2:4c in /var/db/dhcpd_leases ...
	I0731 10:06:07.619621    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:06:07.619647    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:06:07.619664    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:06:07.619685    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:06:07.619703    3827 main.go:141] libmachine: (ha-393000-m03) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abbe0e}
	I0731 10:06:07.619712    3827 main.go:141] libmachine: (ha-393000-m03) DBG | Found match: 3e:56:a2:18:e2:4c
	I0731 10:06:07.619727    3827 main.go:141] libmachine: (ha-393000-m03) DBG | IP: 192.169.0.7
	I0731 10:06:07.619755    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetConfigRaw
	I0731 10:06:07.620809    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:07.621055    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:07.621590    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:06:07.621602    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:07.621745    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:07.621861    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:07.621957    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:07.622061    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:07.622150    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:07.622290    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:07.622460    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:07.622469    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:06:07.625744    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:06:07.635188    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:06:07.636453    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:07.636476    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:07.636488    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:07.636503    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:07 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:08.026194    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:06:08.026210    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:06:08.141380    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:08.141403    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:08.141420    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:08.141430    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:08.142228    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:06:08.142237    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:08 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:06:13.717443    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:06:13.717596    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:06:13.717612    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:06:13.741129    3827 main.go:141] libmachine: (ha-393000-m03) DBG | 2024/07/31 10:06:13 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:06:18.682578    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:06:18.682599    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.682767    3827 buildroot.go:166] provisioning hostname "ha-393000-m03"
	I0731 10:06:18.682779    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.682866    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.682981    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.683070    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.683166    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.683267    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.683412    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.683571    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.683581    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m03 && echo "ha-393000-m03" | sudo tee /etc/hostname
	I0731 10:06:18.749045    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m03
	
	I0731 10:06:18.749064    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.749190    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.749278    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.749369    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.749454    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.749565    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.749706    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.749722    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:06:18.806865    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:06:18.806883    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:06:18.806892    3827 buildroot.go:174] setting up certificates
	I0731 10:06:18.806898    3827 provision.go:84] configureAuth start
	I0731 10:06:18.806904    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetMachineName
	I0731 10:06:18.807035    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:18.807129    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.807209    3827 provision.go:143] copyHostCerts
	I0731 10:06:18.807236    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:06:18.807287    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:06:18.807293    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:06:18.807440    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:06:18.807654    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:06:18.807687    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:06:18.807691    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:06:18.807798    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:06:18.807946    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:06:18.807978    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:06:18.807983    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:06:18.808051    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:06:18.808199    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m03 san=[127.0.0.1 192.169.0.7 ha-393000-m03 localhost minikube]
	I0731 10:06:18.849388    3827 provision.go:177] copyRemoteCerts
	I0731 10:06:18.849440    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:06:18.849454    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.849608    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.849706    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.849793    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.849878    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:18.882927    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:06:18.883001    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:06:18.902836    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:06:18.902904    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:06:18.922711    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:06:18.922778    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0731 10:06:18.943709    3827 provision.go:87] duration metric: took 136.803232ms to configureAuth
	I0731 10:06:18.943724    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:06:18.943896    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:18.943910    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:18.944075    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.944168    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.944245    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.944342    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.944422    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.944538    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.944665    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.944672    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:06:18.996744    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:06:18.996756    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:06:18.996829    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:06:18.996840    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:18.996972    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:18.997082    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.997171    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:18.997252    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:18.997394    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:18.997538    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:18.997587    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:06:19.061774    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:06:19.061792    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:19.061924    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:19.062001    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:19.062094    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:19.062183    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:19.062322    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:19.062475    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:19.062487    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:06:20.667693    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:06:20.667709    3827 machine.go:97] duration metric: took 13.046112735s to provisionDockerMachine
	I0731 10:06:20.667718    3827 start.go:293] postStartSetup for "ha-393000-m03" (driver="hyperkit")
	I0731 10:06:20.667725    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:06:20.667738    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.667939    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:06:20.667954    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.668063    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.668167    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.668260    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.668365    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:20.711043    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:06:20.714520    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:06:20.714533    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:06:20.714632    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:06:20.714782    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:06:20.714789    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:06:20.714971    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:06:20.725237    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:06:20.756197    3827 start.go:296] duration metric: took 88.463878ms for postStartSetup
	I0731 10:06:20.756221    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.756402    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:06:20.756417    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.756509    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.756594    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.756688    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.756757    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:20.788829    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:06:20.788889    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:06:20.841715    3827 fix.go:56] duration metric: took 13.365618842s for fixHost
	I0731 10:06:20.841743    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:20.841878    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:20.841982    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.842069    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:20.842155    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:20.842314    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:20.842486    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.7 22 <nil> <nil>}
	I0731 10:06:20.842494    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:06:20.895743    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445580.896263750
	
	I0731 10:06:20.895763    3827 fix.go:216] guest clock: 1722445580.896263750
	I0731 10:06:20.895768    3827 fix.go:229] Guest: 2024-07-31 10:06:20.89626375 -0700 PDT Remote: 2024-07-31 10:06:20.841731 -0700 PDT m=+78.507993684 (delta=54.53275ms)
	I0731 10:06:20.895779    3827 fix.go:200] guest clock delta is within tolerance: 54.53275ms
	I0731 10:06:20.895783    3827 start.go:83] releasing machines lock for "ha-393000-m03", held for 13.419701289s
	I0731 10:06:20.895800    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:20.895930    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:20.933794    3827 out.go:177] * Found network options:
	I0731 10:06:21.008361    3827 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6
	W0731 10:06:21.029193    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:06:21.029220    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:06:21.029239    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.029902    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.030149    3827 main.go:141] libmachine: (ha-393000-m03) Calling .DriverName
	I0731 10:06:21.030274    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:06:21.030303    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	W0731 10:06:21.030372    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:06:21.030402    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:06:21.030458    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:21.030487    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:06:21.030508    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHHostname
	I0731 10:06:21.030615    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHPort
	I0731 10:06:21.030657    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:21.030724    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHKeyPath
	I0731 10:06:21.030782    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:21.030837    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetSSHUsername
	I0731 10:06:21.030887    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	I0731 10:06:21.030941    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m03/id_rsa Username:docker}
	W0731 10:06:21.060481    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:06:21.060548    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:06:21.113024    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:06:21.113039    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:06:21.113103    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:06:21.128523    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:06:21.136837    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:06:21.145325    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:06:21.145388    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:06:21.153686    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:06:21.162021    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:06:21.170104    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:06:21.178345    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:06:21.186720    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:06:21.195003    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:06:21.203212    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:06:21.211700    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:06:21.219303    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:06:21.226730    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:21.333036    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:06:21.355400    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:06:21.355468    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:06:21.370793    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:06:21.382599    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:06:21.397116    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:06:21.408366    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:06:21.419500    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:06:21.441593    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:06:21.453210    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:06:21.468638    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:06:21.471686    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:06:21.480107    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:06:21.493473    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:06:21.590098    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:06:21.695002    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:06:21.695025    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:06:21.709644    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:21.804799    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:06:24.090859    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.286034061s)
	I0731 10:06:24.090921    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:06:24.102085    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:06:24.115631    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:06:24.125950    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:06:24.222193    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:06:24.332843    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:24.449689    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:06:24.463232    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:06:24.474652    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:24.567486    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:06:24.631150    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:06:24.631230    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:06:24.635708    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:06:24.635764    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:06:24.638929    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:06:24.666470    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:06:24.666542    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:06:24.686587    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:06:24.729344    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:06:24.771251    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:06:24.792172    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 10:06:24.813314    3827 main.go:141] libmachine: (ha-393000-m03) Calling .GetIP
	I0731 10:06:24.813703    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:06:24.818215    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:06:24.828147    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:06:24.828324    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:24.828531    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:24.828552    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:24.837259    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52063
	I0731 10:06:24.837609    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:24.837954    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:24.837967    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:24.838165    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:24.838272    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:06:24.838349    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:24.838424    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:06:24.839404    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:06:24.839647    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:24.839672    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:24.848293    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52065
	I0731 10:06:24.848630    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:24.848982    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:24.848999    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:24.849191    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:24.849297    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:06:24.849393    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.7
	I0731 10:06:24.849399    3827 certs.go:194] generating shared ca certs ...
	I0731 10:06:24.849408    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:06:24.849551    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:06:24.849606    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:06:24.849615    3827 certs.go:256] generating profile certs ...
	I0731 10:06:24.849710    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key
	I0731 10:06:24.849799    3827 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key.01a63cdb
	I0731 10:06:24.849848    3827 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key
	I0731 10:06:24.849860    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:06:24.849881    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:06:24.849901    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:06:24.849920    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:06:24.849937    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0731 10:06:24.849955    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0731 10:06:24.849974    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0731 10:06:24.849991    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0731 10:06:24.850072    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:06:24.850109    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:06:24.850118    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:06:24.850152    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:06:24.850184    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:06:24.850218    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:06:24.850285    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:06:24.850322    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:06:24.850344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:06:24.850366    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:24.850395    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHHostname
	I0731 10:06:24.850485    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHPort
	I0731 10:06:24.850565    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHKeyPath
	I0731 10:06:24.850653    3827 main.go:141] libmachine: (ha-393000) Calling .GetSSHUsername
	I0731 10:06:24.850732    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.5 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000/id_rsa Username:docker}
	I0731 10:06:24.882529    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0731 10:06:24.886785    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0731 10:06:24.896598    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0731 10:06:24.900384    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0731 10:06:24.910269    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0731 10:06:24.914011    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0731 10:06:24.922532    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0731 10:06:24.925784    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0731 10:06:24.936850    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0731 10:06:24.940321    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0731 10:06:24.950026    3827 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0731 10:06:24.953055    3827 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0731 10:06:24.962295    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:06:24.982990    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:06:25.003016    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:06:25.022822    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:06:25.043864    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0731 10:06:25.064140    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0731 10:06:25.084546    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0731 10:06:25.105394    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0731 10:06:25.125890    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:06:25.146532    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:06:25.166742    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:06:25.186545    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0731 10:06:25.200206    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0731 10:06:25.214106    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0731 10:06:25.228037    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0731 10:06:25.242065    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0731 10:06:25.255847    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0731 10:06:25.269574    3827 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0731 10:06:25.283881    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:06:25.288466    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:06:25.297630    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.301289    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.301331    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:06:25.305714    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:06:25.314348    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:06:25.322967    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.326578    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.326634    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:06:25.330926    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:06:25.339498    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:06:25.348151    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.351535    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.351576    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:06:25.355921    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:06:25.364535    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:06:25.368077    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0731 10:06:25.372428    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0731 10:06:25.376757    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0731 10:06:25.380980    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0731 10:06:25.385296    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0731 10:06:25.389606    3827 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0731 10:06:25.393857    3827 kubeadm.go:934] updating node {m03 192.169.0.7 8443 v1.30.3 docker true true} ...
	I0731 10:06:25.393914    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.7
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:06:25.393928    3827 kube-vip.go:115] generating kube-vip config ...
	I0731 10:06:25.393959    3827 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0731 10:06:25.405786    3827 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0731 10:06:25.405830    3827 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.169.0.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0731 10:06:25.405888    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:06:25.414334    3827 binaries.go:44] Found k8s binaries, skipping transfer
	I0731 10:06:25.414379    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0731 10:06:25.422310    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:06:25.435970    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:06:25.449652    3827 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1440 bytes)
	I0731 10:06:25.463392    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:06:25.466266    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:06:25.476391    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:25.572265    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:06:25.587266    3827 start.go:235] Will wait 6m0s for node &{Name:m03 IP:192.169.0.7 Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0731 10:06:25.587454    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:25.609105    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:06:25.650600    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:06:25.776520    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:06:25.790838    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:06:25.791048    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:06:25.791095    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:06:25.791257    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m03" to be "Ready" ...
	I0731 10:06:25.791299    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:25.791305    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.791311    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.791315    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.793351    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:25.793683    3827 node_ready.go:49] node "ha-393000-m03" has status "Ready":"True"
	I0731 10:06:25.793693    3827 node_ready.go:38] duration metric: took 2.426331ms for node "ha-393000-m03" to be "Ready" ...
	I0731 10:06:25.793700    3827 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:25.793737    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:25.793742    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.793753    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.793758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.797877    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:25.803934    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:25.803995    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:25.804000    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.804007    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.804011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.806477    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:25.806997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:25.807005    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:25.807011    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:25.807014    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:25.808989    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:26.304983    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:26.304998    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.305006    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.305010    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.307209    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:26.307839    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:26.307846    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.307852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.307861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.309644    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:26.805493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:26.805510    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.805520    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.805527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.821394    3827 round_trippers.go:574] Response Status: 200 OK in 15 milliseconds
	I0731 10:06:26.822205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:26.822215    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:26.822221    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:26.822224    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:26.827160    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:27.305824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:27.305839    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.305846    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.305848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.308258    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.308744    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:27.308752    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.308758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.308761    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.310974    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.805552    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:27.805567    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.805574    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.805578    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.807860    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.808403    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:27.808410    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:27.808416    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:27.808419    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:27.810436    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:27.810811    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:28.305577    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:28.305593    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.305600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.305604    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.311583    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:28.312446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:28.312455    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.312461    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.312465    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.314779    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:28.804391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:28.804407    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.804414    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.804420    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.806848    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:28.807227    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:28.807235    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:28.807241    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:28.807244    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:28.809171    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:29.305552    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:29.305615    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.305624    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.305629    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.308134    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.308891    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:29.308900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.308906    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.308909    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.311098    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.805109    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:29.805127    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.805192    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.805198    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.807898    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:29.808285    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:29.808292    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:29.808297    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:29.808300    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:29.810154    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:30.305017    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:30.305032    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.305045    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.305048    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.307205    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:30.307776    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:30.307783    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.307789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.307792    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.309771    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:30.310293    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:30.805366    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:30.805428    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.805436    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.805440    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.807864    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:30.808309    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:30.808316    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:30.808322    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:30.808325    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:30.810111    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:31.305667    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:31.305700    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.305708    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.305712    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.308126    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:31.308539    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:31.308546    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.308552    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.308556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.310279    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:31.804975    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:31.805002    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.805014    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.805020    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.808534    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:31.809053    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:31.809061    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:31.809066    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:31.809069    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:31.810955    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:32.304759    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:32.304815    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.304830    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.304839    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.308267    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:32.308684    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:32.308692    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.308698    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.308701    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.310475    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:32.310804    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:32.805138    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:32.805163    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.805175    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.805181    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.808419    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:32.809125    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:32.809133    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:32.809139    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:32.809143    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:32.810741    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:33.305088    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:33.305103    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.305109    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.305113    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.307495    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:33.307998    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:33.308005    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.308011    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.308015    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.309595    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:33.806000    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:33.806021    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.806049    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.806056    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.808625    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:33.809248    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:33.809259    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:33.809264    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:33.809269    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:33.810758    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:34.305752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:34.305832    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.305847    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.305853    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.308868    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:34.309591    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:34.309599    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.309605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.309608    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.311263    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:34.311627    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:34.804923    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:34.804948    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.804959    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.804965    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.808036    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:34.808636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:34.808646    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:34.808654    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:34.808670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:34.810398    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:35.305879    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:35.305966    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.305982    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.305991    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.309016    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:35.309584    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:35.309592    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.309598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.309601    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.311155    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:35.804092    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:35.804107    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.804114    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.804117    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.806476    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:35.806988    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:35.806997    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:35.807002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:35.807025    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:35.808897    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.305921    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:36.305943    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.305951    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.305955    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.308670    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:36.309170    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:36.309178    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.309184    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.309199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.310943    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.805015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:36.805085    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.805098    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.805106    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.808215    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:36.808810    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:36.808817    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:36.808823    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:36.808827    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:36.810482    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:36.810768    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:37.305031    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:37.305055    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.305068    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.305077    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.308209    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:37.308934    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:37.308942    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.308947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.308951    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.310514    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:37.805625    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:37.805671    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.805682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.805687    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.808188    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:37.808728    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:37.808735    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:37.808741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:37.808744    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:37.810288    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:38.305824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:38.305838    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.305845    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.305848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.307926    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:38.308378    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:38.308386    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.308391    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.308395    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.310092    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:38.805380    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:38.805397    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.805406    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.805410    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.807819    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:38.808368    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:38.808376    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:38.808382    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:38.808385    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:38.809904    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:39.305804    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:39.305820    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.305826    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.305830    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.307991    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:39.308527    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:39.308535    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.308541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.308546    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.310495    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:39.310929    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:39.806108    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:39.806122    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.806129    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.806132    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.808192    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:39.808709    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:39.808718    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:39.808727    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:39.808730    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:39.810476    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:40.304101    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:40.304125    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.304137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.304144    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.307004    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:40.307629    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:40.307637    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.307643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.307646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.309373    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:40.804289    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:40.804302    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.804329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.804334    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.806678    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:40.807320    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:40.807328    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:40.807334    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:40.807338    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:40.809111    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:41.305710    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:41.305762    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.305770    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.305774    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.307795    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.308244    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:41.308252    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.308258    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.308261    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.310033    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:41.805219    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:41.805235    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.805242    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.805246    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.807574    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.808103    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:41.808112    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:41.808119    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:41.808123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:41.810305    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:41.810720    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:42.305509    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:42.305569    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.305580    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.305586    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.307774    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:42.308154    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:42.308161    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.308167    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.308170    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.309895    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:42.804631    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:42.804655    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.804667    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.804687    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.808080    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:42.808852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:42.808863    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:42.808869    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:42.808874    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:42.811059    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.304116    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:43.304217    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.304233    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.304239    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.306879    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.307340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:43.307348    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.307354    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.307358    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.308948    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:43.805920    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:43.805934    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.805981    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.805986    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.808009    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:43.808576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:43.808583    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:43.808589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:43.808592    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:43.810282    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:43.810804    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:44.304703    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:44.304728    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.304798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.304823    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.308376    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:44.308780    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:44.308787    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.308793    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.308797    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.310396    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:44.805218    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:44.805242    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.805255    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.805264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.808404    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:44.808967    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:44.808978    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:44.808986    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:44.808990    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:44.810748    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:45.304672    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:45.304770    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.304784    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.304791    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.307754    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:45.308249    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:45.308256    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.308261    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.308265    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.309903    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:45.804236    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:45.804265    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.804276    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.804281    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.807605    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:45.808214    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:45.808222    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:45.808228    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:45.808231    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:45.810076    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:46.305660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:46.305674    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.305723    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.305727    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.307959    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:46.308389    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:46.308397    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.308403    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.308406    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.310188    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:46.310668    3827 pod_ready.go:102] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"False"
	I0731 10:06:46.805585    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:46.805685    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.805700    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.805708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.808399    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:46.808892    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:46.808900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:46.808910    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:46.808914    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:46.810397    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.304911    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-5m8st
	I0731 10:06:47.304926    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.304933    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.304936    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.307282    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.307761    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.307768    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.307774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.307777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.309541    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.309921    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.309931    3827 pod_ready.go:81] duration metric: took 21.505983976s for pod "coredns-7db6d8ff4d-5m8st" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.309937    3827 pod_ready.go:78] waiting up to 6m0s for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.309966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/coredns-7db6d8ff4d-wvqjl
	I0731 10:06:47.309971    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.309977    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.309980    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.311547    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.311995    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.312003    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.312009    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.312013    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.313414    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.313802    3827 pod_ready.go:92] pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.313811    3827 pod_ready.go:81] duration metric: took 3.869093ms for pod "coredns-7db6d8ff4d-wvqjl" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.313818    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.313850    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000
	I0731 10:06:47.313855    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.313861    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.313865    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.315523    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.315938    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.315947    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.315955    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.315959    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.317522    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.317922    3827 pod_ready.go:92] pod "etcd-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.317931    3827 pod_ready.go:81] duration metric: took 4.10711ms for pod "etcd-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.317937    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.317971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m02
	I0731 10:06:47.317976    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.317982    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.317985    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.319520    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.319893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:47.319900    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.319906    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.319909    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.321439    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.321816    3827 pod_ready.go:92] pod "etcd-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.321825    3827 pod_ready.go:81] duration metric: took 3.88293ms for pod "etcd-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.321832    3827 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.321862    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/etcd-ha-393000-m03
	I0731 10:06:47.321867    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.321872    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.321876    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.323407    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.323756    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:47.323763    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.323769    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.323773    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.325384    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:47.325703    3827 pod_ready.go:92] pod "etcd-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.325712    3827 pod_ready.go:81] duration metric: took 3.875112ms for pod "etcd-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.325727    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.505410    3827 request.go:629] Waited for 179.649549ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:06:47.505447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000
	I0731 10:06:47.505454    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.505462    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.505467    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.508003    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.705861    3827 request.go:629] Waited for 197.38651ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.705965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:47.705976    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.705987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.705997    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.708863    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:47.709477    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:47.709486    3827 pod_ready.go:81] duration metric: took 383.754198ms for pod "kube-apiserver-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.709493    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:47.905743    3827 request.go:629] Waited for 196.205437ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:47.905783    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m02
	I0731 10:06:47.905790    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:47.905812    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:47.905826    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:47.908144    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.106945    3827 request.go:629] Waited for 198.217758ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:48.106991    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:48.106998    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.107017    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.107023    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.109503    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.109889    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.109898    3827 pod_ready.go:81] duration metric: took 400.399458ms for pod "kube-apiserver-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.109910    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.306502    3827 request.go:629] Waited for 196.553294ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:48.306576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-393000-m03
	I0731 10:06:48.306583    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.306589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.306593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.308907    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.506077    3827 request.go:629] Waited for 196.82354ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:48.506171    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:48.506180    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.506189    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.506195    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.508341    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:48.508805    3827 pod_ready.go:92] pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.508814    3827 pod_ready.go:81] duration metric: took 398.898513ms for pod "kube-apiserver-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.508829    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.706656    3827 request.go:629] Waited for 197.780207ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:48.706753    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000
	I0731 10:06:48.706765    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.706776    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.706784    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.709960    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:48.906621    3827 request.go:629] Waited for 195.987746ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:48.906714    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:48.906726    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:48.906737    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:48.906744    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:48.910100    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:48.910537    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:48.910550    3827 pod_ready.go:81] duration metric: took 401.715473ms for pod "kube-controller-manager-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:48.910559    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.106125    3827 request.go:629] Waited for 195.518023ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:49.106250    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m02
	I0731 10:06:49.106262    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.106273    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.106280    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.109411    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:49.306599    3827 request.go:629] Waited for 196.360989ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:49.306720    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:49.306730    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.306741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.306747    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.309953    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:49.310311    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:49.310320    3827 pod_ready.go:81] duration metric: took 399.753992ms for pod "kube-controller-manager-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.310327    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.505092    3827 request.go:629] Waited for 194.718659ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:49.505129    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-393000-m03
	I0731 10:06:49.505134    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.505140    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.505144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.510347    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:49.706499    3827 request.go:629] Waited for 195.722594ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:49.706547    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:49.706556    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.706623    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.706634    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.709639    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:49.710039    3827 pod_ready.go:92] pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:49.710049    3827 pod_ready.go:81] duration metric: took 399.716837ms for pod "kube-controller-manager-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.710061    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:49.906378    3827 request.go:629] Waited for 196.280735ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:49.906412    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cf577
	I0731 10:06:49.906418    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:49.906425    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:49.906442    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:49.911634    3827 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0731 10:06:50.106586    3827 request.go:629] Waited for 194.536585ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:50.106637    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:50.106652    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.106717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.106725    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.109661    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:50.110176    3827 pod_ready.go:92] pod "kube-proxy-cf577" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.110189    3827 pod_ready.go:81] duration metric: took 400.121095ms for pod "kube-proxy-cf577" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.110197    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.306216    3827 request.go:629] Waited for 195.968962ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:50.306280    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-cr9pg
	I0731 10:06:50.306286    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.306291    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.306301    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.308314    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:50.505180    3827 request.go:629] Waited for 196.336434ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:50.505320    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:50.505332    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.505344    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.505351    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.508601    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:50.509059    3827 pod_ready.go:92] pod "kube-proxy-cr9pg" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.509072    3827 pod_ready.go:81] duration metric: took 398.868353ms for pod "kube-proxy-cr9pg" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.509081    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.705014    3827 request.go:629] Waited for 195.886159ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:50.705123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-proxy-zc52f
	I0731 10:06:50.705134    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.705144    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.705151    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.708274    3827 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0731 10:06:50.906912    3827 request.go:629] Waited for 198.179332ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:50.906985    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:50.906991    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:50.906997    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:50.907002    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:50.908938    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:50.909509    3827 pod_ready.go:92] pod "kube-proxy-zc52f" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:50.909519    3827 pod_ready.go:81] duration metric: took 400.431581ms for pod "kube-proxy-zc52f" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:50.909525    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.106576    3827 request.go:629] Waited for 197.012349ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:51.106660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000
	I0731 10:06:51.106668    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.106677    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.106682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.109021    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.305894    3827 request.go:629] Waited for 196.495089ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:51.305945    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000
	I0731 10:06:51.306000    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.306010    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.306018    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.308864    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.309301    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:51.309311    3827 pod_ready.go:81] duration metric: took 399.779835ms for pod "kube-scheduler-ha-393000" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.309324    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.504969    3827 request.go:629] Waited for 195.610894ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:51.505060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m02
	I0731 10:06:51.505066    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.505072    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.505076    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.507056    3827 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0731 10:06:51.705447    3827 request.go:629] Waited for 197.942219ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:51.705508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m02
	I0731 10:06:51.705515    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.705522    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.705527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.707999    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:51.708367    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:51.708379    3827 pod_ready.go:81] duration metric: took 399.049193ms for pod "kube-scheduler-ha-393000-m02" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.708391    3827 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:51.906469    3827 request.go:629] Waited for 198.035792ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:51.906523    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-393000-m03
	I0731 10:06:51.906531    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:51.906539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:51.906545    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:51.909082    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.105038    3827 request.go:629] Waited for 195.597271ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:52.105087    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m03
	I0731 10:06:52.105095    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.105157    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.105168    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.108049    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.108591    3827 pod_ready.go:92] pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace has status "Ready":"True"
	I0731 10:06:52.108604    3827 pod_ready.go:81] duration metric: took 400.204131ms for pod "kube-scheduler-ha-393000-m03" in "kube-system" namespace to be "Ready" ...
	I0731 10:06:52.108615    3827 pod_ready.go:38] duration metric: took 26.314911332s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0731 10:06:52.108628    3827 api_server.go:52] waiting for apiserver process to appear ...
	I0731 10:06:52.108680    3827 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:06:52.120989    3827 api_server.go:72] duration metric: took 26.533695803s to wait for apiserver process to appear ...
	I0731 10:06:52.121002    3827 api_server.go:88] waiting for apiserver healthz status ...
	I0731 10:06:52.121014    3827 api_server.go:253] Checking apiserver healthz at https://192.169.0.5:8443/healthz ...
	I0731 10:06:52.124310    3827 api_server.go:279] https://192.169.0.5:8443/healthz returned 200:
	ok
	I0731 10:06:52.124340    3827 round_trippers.go:463] GET https://192.169.0.5:8443/version
	I0731 10:06:52.124344    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.124353    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.124358    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.124912    3827 round_trippers.go:574] Response Status: 200 OK in 0 milliseconds
	I0731 10:06:52.124978    3827 api_server.go:141] control plane version: v1.30.3
	I0731 10:06:52.124989    3827 api_server.go:131] duration metric: took 3.981645ms to wait for apiserver health ...
	I0731 10:06:52.124994    3827 system_pods.go:43] waiting for kube-system pods to appear ...
	I0731 10:06:52.305762    3827 request.go:629] Waited for 180.72349ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.305845    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.305853    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.305861    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.305872    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.310548    3827 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0731 10:06:52.315274    3827 system_pods.go:59] 24 kube-system pods found
	I0731 10:06:52.315286    3827 system_pods.go:61] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 10:06:52.315289    3827 system_pods.go:61] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:52.315292    3827 system_pods.go:61] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:52.315295    3827 system_pods.go:61] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:52.315298    3827 system_pods.go:61] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:52.315301    3827 system_pods.go:61] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:52.315303    3827 system_pods.go:61] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:52.315306    3827 system_pods.go:61] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:52.315311    3827 system_pods.go:61] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:52.315313    3827 system_pods.go:61] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:52.315316    3827 system_pods.go:61] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:52.315319    3827 system_pods.go:61] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:52.315322    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:52.315327    3827 system_pods.go:61] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:52.315330    3827 system_pods.go:61] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:52.315333    3827 system_pods.go:61] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:52.315335    3827 system_pods.go:61] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:52.315338    3827 system_pods.go:61] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:52.315341    3827 system_pods.go:61] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:52.315343    3827 system_pods.go:61] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:52.315346    3827 system_pods.go:61] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:52.315348    3827 system_pods.go:61] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:52.315350    3827 system_pods.go:61] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:52.315353    3827 system_pods.go:61] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:52.315358    3827 system_pods.go:74] duration metric: took 190.3593ms to wait for pod list to return data ...
	I0731 10:06:52.315363    3827 default_sa.go:34] waiting for default service account to be created ...
	I0731 10:06:52.505103    3827 request.go:629] Waited for 189.702061ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:52.505178    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/default/serviceaccounts
	I0731 10:06:52.505187    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.505195    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.505199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.507558    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.507636    3827 default_sa.go:45] found service account: "default"
	I0731 10:06:52.507644    3827 default_sa.go:55] duration metric: took 192.276446ms for default service account to be created ...
	I0731 10:06:52.507666    3827 system_pods.go:116] waiting for k8s-apps to be running ...
	I0731 10:06:52.705427    3827 request.go:629] Waited for 197.710286ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.705484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/namespaces/kube-system/pods
	I0731 10:06:52.705497    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.705519    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.705526    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.711904    3827 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0731 10:06:52.716760    3827 system_pods.go:86] 24 kube-system pods found
	I0731 10:06:52.716772    3827 system_pods.go:89] "coredns-7db6d8ff4d-5m8st" [38affffc-51df-4613-b507-5ec8a04bad69] Running
	I0731 10:06:52.716777    3827 system_pods.go:89] "coredns-7db6d8ff4d-wvqjl" [1e9019db-4958-4bd4-bb4f-37031d5bd43d] Running
	I0731 10:06:52.716780    3827 system_pods.go:89] "etcd-ha-393000" [0a2d8313-3b67-466d-9350-ee72484084b0] Running
	I0731 10:06:52.716783    3827 system_pods.go:89] "etcd-ha-393000-m02" [c180fab1-1045-4386-b23e-13daaebb0786] Running
	I0731 10:06:52.716787    3827 system_pods.go:89] "etcd-ha-393000-m03" [d3a74c79-2d9e-4816-ad71-54cbd3eacabc] Running
	I0731 10:06:52.716790    3827 system_pods.go:89] "kindnet-hjm7c" [6a771cfe-2f5e-40d9-b339-dd4fc889b9a6] Running
	I0731 10:06:52.716794    3827 system_pods.go:89] "kindnet-lcwbs" [1bb6e6a4-4cb7-4e6a-b30b-75592d16e675] Running
	I0731 10:06:52.716798    3827 system_pods.go:89] "kindnet-s2pv6" [458e7e9d-e3ca-4f1b-a4d0-75a94ad66aff] Running
	I0731 10:06:52.716801    3827 system_pods.go:89] "kube-apiserver-ha-393000" [f8187910-ec7b-4a73-9324-937e773c7e04] Running
	I0731 10:06:52.716805    3827 system_pods.go:89] "kube-apiserver-ha-393000-m02" [03c22efa-0ea2-4873-8e31-acfb01c8dc5f] Running
	I0731 10:06:52.716809    3827 system_pods.go:89] "kube-apiserver-ha-393000-m03" [c16c529f-64bb-40fb-8589-fdf7e72ab008] Running
	I0731 10:06:52.716813    3827 system_pods.go:89] "kube-controller-manager-ha-393000" [36582bcb-355b-4634-8de6-f380d2a4abca] Running
	I0731 10:06:52.716816    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m02" [18e1b299-9ee8-4416-95e3-0a77eec2348b] Running
	I0731 10:06:52.716819    3827 system_pods.go:89] "kube-controller-manager-ha-393000-m03" [1389cb68-d2fe-4ef8-9ff5-34da67bae48e] Running
	I0731 10:06:52.716823    3827 system_pods.go:89] "kube-proxy-cf577" [5029c2c3-45ca-49c7-bbf1-e33aae76bb09] Running
	I0731 10:06:52.716827    3827 system_pods.go:89] "kube-proxy-cr9pg" [debcdc56-8f82-4472-b306-59baed6966e3] Running
	I0731 10:06:52.716830    3827 system_pods.go:89] "kube-proxy-zc52f" [5265965e-2c9e-4a74-b18f-8a0cc13110f6] Running
	I0731 10:06:52.716833    3827 system_pods.go:89] "kube-scheduler-ha-393000" [35101c62-4c00-4e51-b4d7-54ec4447b236] Running
	I0731 10:06:52.716836    3827 system_pods.go:89] "kube-scheduler-ha-393000-m02" [ed14c671-16c4-4a7d-9e48-9670078a0e3e] Running
	I0731 10:06:52.716854    3827 system_pods.go:89] "kube-scheduler-ha-393000-m03" [b7094682-327f-4640-8ec9-0b88cb04167f] Running
	I0731 10:06:52.716860    3827 system_pods.go:89] "kube-vip-ha-393000" [d64869a1-a7a1-45ea-9baf-80e466c2de02] Running
	I0731 10:06:52.716864    3827 system_pods.go:89] "kube-vip-ha-393000-m02" [3523ddf2-a1d7-4493-ba80-7fd6286bd47c] Running
	I0731 10:06:52.716867    3827 system_pods.go:89] "kube-vip-ha-393000-m03" [063bd145-fb06-42aa-ac69-4ea8ed4b2cdf] Running
	I0731 10:06:52.716871    3827 system_pods.go:89] "storage-provisioner" [a59b97ca-f030-4c73-b4db-00b444d39095] Running
	I0731 10:06:52.716876    3827 system_pods.go:126] duration metric: took 209.203713ms to wait for k8s-apps to be running ...
	I0731 10:06:52.716881    3827 system_svc.go:44] waiting for kubelet service to be running ....
	I0731 10:06:52.716936    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:06:52.731223    3827 system_svc.go:56] duration metric: took 14.33545ms WaitForService to wait for kubelet
	I0731 10:06:52.731240    3827 kubeadm.go:582] duration metric: took 27.143948309s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0731 10:06:52.731255    3827 node_conditions.go:102] verifying NodePressure condition ...
	I0731 10:06:52.906178    3827 request.go:629] Waited for 174.879721ms due to client-side throttling, not priority and fairness, request: GET:https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:52.906213    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes
	I0731 10:06:52.906218    3827 round_trippers.go:469] Request Headers:
	I0731 10:06:52.906257    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:06:52.906264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:06:52.908378    3827 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0731 10:06:52.909014    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909025    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909032    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909035    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909039    3827 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0731 10:06:52.909041    3827 node_conditions.go:123] node cpu capacity is 2
	I0731 10:06:52.909045    3827 node_conditions.go:105] duration metric: took 177.780993ms to run NodePressure ...
	I0731 10:06:52.909053    3827 start.go:241] waiting for startup goroutines ...
	I0731 10:06:52.909067    3827 start.go:255] writing updated cluster config ...
	I0731 10:06:52.931184    3827 out.go:177] 
	I0731 10:06:52.952773    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:06:52.952858    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:52.974676    3827 out.go:177] * Starting "ha-393000-m04" worker node in "ha-393000" cluster
	I0731 10:06:53.016553    3827 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 10:06:53.016583    3827 cache.go:56] Caching tarball of preloaded images
	I0731 10:06:53.016766    3827 preload.go:172] Found /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0731 10:06:53.016784    3827 cache.go:59] Finished verifying existence of preloaded tar for v1.30.3 on docker
	I0731 10:06:53.016901    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:53.017869    3827 start.go:360] acquireMachinesLock for ha-393000-m04: {Name:mk07dc778be495a1e14d9470c0e6621dd3133dbc Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0731 10:06:53.017982    3827 start.go:364] duration metric: took 90.107µs to acquireMachinesLock for "ha-393000-m04"
	I0731 10:06:53.018005    3827 start.go:96] Skipping create...Using existing machine configuration
	I0731 10:06:53.018013    3827 fix.go:54] fixHost starting: m04
	I0731 10:06:53.018399    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:06:53.018423    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:06:53.027659    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52069
	I0731 10:06:53.028033    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:06:53.028349    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:06:53.028359    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:06:53.028586    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:06:53.028695    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:06:53.028810    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetState
	I0731 10:06:53.028891    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.028978    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3095
	I0731 10:06:53.029947    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid 3095 missing from process table
	I0731 10:06:53.029967    3827 fix.go:112] recreateIfNeeded on ha-393000-m04: state=Stopped err=<nil>
	I0731 10:06:53.029982    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	W0731 10:06:53.030076    3827 fix.go:138] unexpected machine state, will restart: <nil>
	I0731 10:06:53.051730    3827 out.go:177] * Restarting existing hyperkit VM for "ha-393000-m04" ...
	I0731 10:06:53.093566    3827 main.go:141] libmachine: (ha-393000-m04) Calling .Start
	I0731 10:06:53.093954    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.094004    3827 main.go:141] libmachine: (ha-393000-m04) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid
	I0731 10:06:53.094113    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Using UUID 8a49f5e0-ba79-41ac-9a76-c032dc065628
	I0731 10:06:53.120538    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Generated MAC d2:d8:fb:1d:1:ee
	I0731 10:06:53.120559    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000
	I0731 10:06:53.120750    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00032a1e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:53.120805    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"8a49f5e0-ba79-41ac-9a76-c032dc065628", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc00032a1e0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage", Initrd:"/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0731 10:06:53.120864    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "8a49f5e0-ba79-41ac-9a76-c032dc065628", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"}
	I0731 10:06:53.120909    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 8a49f5e0-ba79-41ac-9a76-c032dc065628 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/ha-393000-m04.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/tty,log=/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/console-ring -f kexec,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/bzimage,/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-393000"
	I0731 10:06:53.120925    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0731 10:06:53.122259    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 DEBUG: hyperkit: Pid is 3870
	I0731 10:06:53.122766    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Attempt 0
	I0731 10:06:53.122781    3827 main.go:141] libmachine: (ha-393000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:06:53.122872    3827 main.go:141] libmachine: (ha-393000-m04) DBG | hyperkit pid from json: 3870
	I0731 10:06:53.125179    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Searching for d2:d8:fb:1d:1:ee in /var/db/dhcpd_leases ...
	I0731 10:06:53.125242    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Found 7 entries in /var/db/dhcpd_leases!
	I0731 10:06:53.125254    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.7 HWAddress:3e:56:a2:18:e2:4c ID:1,3e:56:a2:18:e2:4c Lease:0x66abc088}
	I0731 10:06:53.125266    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.6 HWAddress:d6:c5:55:d7:1e:6a ID:1,d6:c5:55:d7:1e:6a Lease:0x66abc05c}
	I0731 10:06:53.125273    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.5 HWAddress:9e:7:8b:23:9c:e3 ID:1,9e:7:8b:23:9c:e3 Lease:0x66abc048}
	I0731 10:06:53.125280    3827 main.go:141] libmachine: (ha-393000-m04) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:d8:fb:1d:1:ee ID:1,d2:d8:fb:1d:1:ee Lease:0x66aa6d9b}
	I0731 10:06:53.125287    3827 main.go:141] libmachine: (ha-393000-m04) DBG | Found match: d2:d8:fb:1d:1:ee
	I0731 10:06:53.125295    3827 main.go:141] libmachine: (ha-393000-m04) DBG | IP: 192.169.0.8
	I0731 10:06:53.125358    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetConfigRaw
	I0731 10:06:53.126014    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:06:53.126188    3827 profile.go:143] Saving config to /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/config.json ...
	I0731 10:06:53.126707    3827 machine.go:94] provisionDockerMachine start ...
	I0731 10:06:53.126722    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:06:53.126959    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:06:53.127071    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:06:53.127158    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:06:53.127274    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:06:53.127389    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:06:53.127538    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:06:53.127705    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:06:53.127713    3827 main.go:141] libmachine: About to run SSH command:
	hostname
	I0731 10:06:53.131247    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0731 10:06:53.140131    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0731 10:06:53.141373    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:53.141406    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:53.141429    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:53.141447    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:53.528683    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0731 10:06:53.528699    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0731 10:06:53.643451    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0731 10:06:53.643474    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0731 10:06:53.643483    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0731 10:06:53.643491    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0731 10:06:53.644344    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0731 10:06:53.644357    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:53 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0731 10:06:59.241509    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0731 10:06:59.241622    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0731 10:06:59.241636    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0731 10:06:59.265250    3827 main.go:141] libmachine: (ha-393000-m04) DBG | 2024/07/31 10:06:59 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0731 10:07:04.190144    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0731 10:07:04.190159    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.190326    3827 buildroot.go:166] provisioning hostname "ha-393000-m04"
	I0731 10:07:04.190338    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.190427    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.190528    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.190617    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.190711    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.190826    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.190962    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.191110    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.191119    3827 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-393000-m04 && echo "ha-393000-m04" | sudo tee /etc/hostname
	I0731 10:07:04.259087    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-393000-m04
	
	I0731 10:07:04.259102    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.259236    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.259339    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.259439    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.259526    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.259647    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.259797    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.259811    3827 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-393000-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-393000-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-393000-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0731 10:07:04.323580    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0731 10:07:04.323604    3827 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/19349-1046/.minikube CaCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/19349-1046/.minikube}
	I0731 10:07:04.323616    3827 buildroot.go:174] setting up certificates
	I0731 10:07:04.323623    3827 provision.go:84] configureAuth start
	I0731 10:07:04.323630    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetMachineName
	I0731 10:07:04.323758    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:04.323858    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.323932    3827 provision.go:143] copyHostCerts
	I0731 10:07:04.323960    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:07:04.324021    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem, removing ...
	I0731 10:07:04.324027    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem
	I0731 10:07:04.324150    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/key.pem (1679 bytes)
	I0731 10:07:04.324352    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:07:04.324397    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem, removing ...
	I0731 10:07:04.324402    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem
	I0731 10:07:04.324482    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.pem (1078 bytes)
	I0731 10:07:04.324627    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:07:04.324668    3827 exec_runner.go:144] found /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem, removing ...
	I0731 10:07:04.324674    3827 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem
	I0731 10:07:04.324752    3827 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/19349-1046/.minikube/cert.pem (1123 bytes)
	I0731 10:07:04.324900    3827 provision.go:117] generating server cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem org=jenkins.ha-393000-m04 san=[127.0.0.1 192.169.0.8 ha-393000-m04 localhost minikube]
	I0731 10:07:04.518738    3827 provision.go:177] copyRemoteCerts
	I0731 10:07:04.518793    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0731 10:07:04.518809    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.518951    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.519038    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.519124    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.519202    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:04.553750    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0731 10:07:04.553834    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0731 10:07:04.574235    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0731 10:07:04.574311    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0731 10:07:04.594359    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0731 10:07:04.594433    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0731 10:07:04.614301    3827 provision.go:87] duration metric: took 290.6663ms to configureAuth
	I0731 10:07:04.614319    3827 buildroot.go:189] setting minikube options for container-runtime
	I0731 10:07:04.614509    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:04.614526    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:04.614676    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.614777    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.614880    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.614987    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.615110    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.615236    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.615386    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.615394    3827 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0731 10:07:04.672493    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0731 10:07:04.672505    3827 buildroot.go:70] root file system type: tmpfs
	I0731 10:07:04.672600    3827 provision.go:314] Updating docker unit: /lib/systemd/system/docker.service ...
	I0731 10:07:04.672612    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.672752    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.672835    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.672958    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.673042    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.673159    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.673303    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.673352    3827 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment="NO_PROXY=192.169.0.5"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6"
	Environment="NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7"
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0731 10:07:04.741034    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	Environment=NO_PROXY=192.169.0.5
	Environment=NO_PROXY=192.169.0.5,192.169.0.6
	Environment=NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0731 10:07:04.741052    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:04.741187    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:04.741288    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.741387    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:04.741494    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:04.741621    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:04.741755    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:04.741771    3827 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0731 10:07:06.325916    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0731 10:07:06.325931    3827 machine.go:97] duration metric: took 13.199216588s to provisionDockerMachine
	I0731 10:07:06.325941    3827 start.go:293] postStartSetup for "ha-393000-m04" (driver="hyperkit")
	I0731 10:07:06.325948    3827 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0731 10:07:06.325960    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.326146    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0731 10:07:06.326163    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.326257    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.326346    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.326438    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.326522    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.369998    3827 ssh_runner.go:195] Run: cat /etc/os-release
	I0731 10:07:06.375343    3827 info.go:137] Remote host: Buildroot 2023.02.9
	I0731 10:07:06.375359    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/addons for local assets ...
	I0731 10:07:06.375470    3827 filesync.go:126] Scanning /Users/jenkins/minikube-integration/19349-1046/.minikube/files for local assets ...
	I0731 10:07:06.375663    3827 filesync.go:149] local asset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> 15912.pem in /etc/ssl/certs
	I0731 10:07:06.375669    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /etc/ssl/certs/15912.pem
	I0731 10:07:06.375894    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0731 10:07:06.394523    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:07:06.415884    3827 start.go:296] duration metric: took 89.928396ms for postStartSetup
	I0731 10:07:06.415906    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.416074    3827 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0731 10:07:06.416088    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.416193    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.416287    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.416381    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.416451    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.451487    3827 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0731 10:07:06.451545    3827 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0731 10:07:06.482558    3827 fix.go:56] duration metric: took 13.464545279s for fixHost
	I0731 10:07:06.482584    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.482724    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.482806    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.482891    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.482992    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.483122    3827 main.go:141] libmachine: Using SSH client type: native
	I0731 10:07:06.483263    3827 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x3e000c0] 0x3e02e20 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0731 10:07:06.483270    3827 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0731 10:07:06.539713    3827 main.go:141] libmachine: SSH cmd err, output: <nil>: 1722445626.658160546
	
	I0731 10:07:06.539725    3827 fix.go:216] guest clock: 1722445626.658160546
	I0731 10:07:06.539731    3827 fix.go:229] Guest: 2024-07-31 10:07:06.658160546 -0700 PDT Remote: 2024-07-31 10:07:06.482574 -0700 PDT m=+124.148842929 (delta=175.586546ms)
	I0731 10:07:06.539746    3827 fix.go:200] guest clock delta is within tolerance: 175.586546ms
	I0731 10:07:06.539751    3827 start.go:83] releasing machines lock for "ha-393000-m04", held for 13.521760862s
	I0731 10:07:06.539766    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.539895    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:06.564336    3827 out.go:177] * Found network options:
	I0731 10:07:06.583958    3827 out.go:177]   - NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	W0731 10:07:06.605128    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605143    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605170    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:07:06.605183    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605593    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605717    3827 main.go:141] libmachine: (ha-393000-m04) Calling .DriverName
	I0731 10:07:06.605786    3827 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0731 10:07:06.605816    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	W0731 10:07:06.605831    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605845    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	W0731 10:07:06.605864    3827 proxy.go:119] fail to check proxy env: Error ip not in block
	I0731 10:07:06.605930    3827 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0731 10:07:06.605931    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.605944    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHHostname
	I0731 10:07:06.606068    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.606081    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHPort
	I0731 10:07:06.606172    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.606197    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHKeyPath
	I0731 10:07:06.606270    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetSSHUsername
	I0731 10:07:06.606322    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	I0731 10:07:06.606369    3827 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/ha-393000-m04/id_rsa Username:docker}
	W0731 10:07:06.638814    3827 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0731 10:07:06.638878    3827 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0731 10:07:06.685734    3827 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0731 10:07:06.685752    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:07:06.685831    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:07:06.701869    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0731 10:07:06.710640    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0731 10:07:06.719391    3827 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0731 10:07:06.719452    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0731 10:07:06.728151    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:07:06.736695    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0731 10:07:06.745525    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0731 10:07:06.754024    3827 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0731 10:07:06.762489    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0731 10:07:06.770723    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0731 10:07:06.779179    3827 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0731 10:07:06.787524    3827 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0731 10:07:06.795278    3827 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0731 10:07:06.802833    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:06.908838    3827 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0731 10:07:06.929085    3827 start.go:495] detecting cgroup driver to use...
	I0731 10:07:06.929153    3827 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0731 10:07:06.946994    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:07:06.958792    3827 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0731 10:07:06.977007    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0731 10:07:06.987118    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:07:06.998383    3827 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0731 10:07:07.019497    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0731 10:07:07.030189    3827 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0731 10:07:07.045569    3827 ssh_runner.go:195] Run: which cri-dockerd
	I0731 10:07:07.048595    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0731 10:07:07.055870    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0731 10:07:07.070037    3827 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0731 10:07:07.166935    3827 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0731 10:07:07.272420    3827 docker.go:574] configuring docker to use "cgroupfs" as cgroup driver...
	I0731 10:07:07.272447    3827 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (130 bytes)
	I0731 10:07:07.286182    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:07.397807    3827 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0731 10:07:09.678871    3827 ssh_runner.go:235] Completed: sudo systemctl restart docker: (2.281044692s)
	I0731 10:07:09.678935    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.socket
	I0731 10:07:09.691390    3827 ssh_runner.go:195] Run: sudo systemctl stop cri-docker.socket
	I0731 10:07:09.706154    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:07:09.718281    3827 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0731 10:07:09.818061    3827 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0731 10:07:09.918372    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:10.020296    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0731 10:07:10.034132    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service cri-docker.service
	I0731 10:07:10.045516    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:10.140924    3827 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.service
	I0731 10:07:10.198542    3827 start.go:542] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0731 10:07:10.198622    3827 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0731 10:07:10.202939    3827 start.go:563] Will wait 60s for crictl version
	I0731 10:07:10.203007    3827 ssh_runner.go:195] Run: which crictl
	I0731 10:07:10.206254    3827 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0731 10:07:10.238107    3827 start.go:579] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  27.1.1
	RuntimeApiVersion:  v1
	I0731 10:07:10.238184    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:07:10.256129    3827 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0731 10:07:10.301307    3827 out.go:204] * Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...
	I0731 10:07:10.337880    3827 out.go:177]   - env NO_PROXY=192.169.0.5
	I0731 10:07:10.396169    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6
	I0731 10:07:10.454080    3827 out.go:177]   - env NO_PROXY=192.169.0.5,192.169.0.6,192.169.0.7
	I0731 10:07:10.491070    3827 main.go:141] libmachine: (ha-393000-m04) Calling .GetIP
	I0731 10:07:10.491478    3827 ssh_runner.go:195] Run: grep 192.169.0.1	host.minikube.internal$ /etc/hosts
	I0731 10:07:10.496573    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.169.0.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:07:10.506503    3827 mustload.go:65] Loading cluster: ha-393000
	I0731 10:07:10.506687    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:10.506931    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:07:10.506954    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:07:10.515949    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52091
	I0731 10:07:10.516322    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:07:10.516656    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:07:10.516668    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:07:10.516893    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:07:10.517004    3827 main.go:141] libmachine: (ha-393000) Calling .GetState
	I0731 10:07:10.517099    3827 main.go:141] libmachine: (ha-393000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:07:10.517181    3827 main.go:141] libmachine: (ha-393000) DBG | hyperkit pid from json: 3840
	I0731 10:07:10.518192    3827 host.go:66] Checking if "ha-393000" exists ...
	I0731 10:07:10.518454    3827 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:07:10.518477    3827 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:07:10.527151    3827 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52093
	I0731 10:07:10.527586    3827 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:07:10.527914    3827 main.go:141] libmachine: Using API Version  1
	I0731 10:07:10.527931    3827 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:07:10.528158    3827 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:07:10.528268    3827 main.go:141] libmachine: (ha-393000) Calling .DriverName
	I0731 10:07:10.528367    3827 certs.go:68] Setting up /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000 for IP: 192.169.0.8
	I0731 10:07:10.528374    3827 certs.go:194] generating shared ca certs ...
	I0731 10:07:10.528388    3827 certs.go:226] acquiring lock for ca certs: {Name:mk884e236a232b80dd837ceba2a70aca0fa86dc9 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0731 10:07:10.528576    3827 certs.go:235] skipping valid "minikubeCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key
	I0731 10:07:10.528655    3827 certs.go:235] skipping valid "proxyClientCA" ca cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key
	I0731 10:07:10.528666    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0731 10:07:10.528692    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0731 10:07:10.528712    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0731 10:07:10.528731    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0731 10:07:10.528834    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem (1338 bytes)
	W0731 10:07:10.528887    3827 certs.go:480] ignoring /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591_empty.pem, impossibly tiny 0 bytes
	I0731 10:07:10.528897    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca-key.pem (1675 bytes)
	I0731 10:07:10.528933    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/ca.pem (1078 bytes)
	I0731 10:07:10.528968    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/cert.pem (1123 bytes)
	I0731 10:07:10.529000    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/key.pem (1679 bytes)
	I0731 10:07:10.529077    3827 certs.go:484] found cert: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem (1708 bytes)
	I0731 10:07:10.529114    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem -> /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.529135    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.529152    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem -> /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.529176    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0731 10:07:10.550191    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0731 10:07:10.570588    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0731 10:07:10.590746    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0731 10:07:10.611034    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/ssl/certs/15912.pem --> /usr/share/ca-certificates/15912.pem (1708 bytes)
	I0731 10:07:10.631281    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0731 10:07:10.651472    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/certs/1591.pem --> /usr/share/ca-certificates/1591.pem (1338 bytes)
	I0731 10:07:10.671880    3827 ssh_runner.go:195] Run: openssl version
	I0731 10:07:10.676790    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/15912.pem && ln -fs /usr/share/ca-certificates/15912.pem /etc/ssl/certs/15912.pem"
	I0731 10:07:10.685541    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.689430    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 31 16:49 /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.689496    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/15912.pem
	I0731 10:07:10.694391    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/15912.pem /etc/ssl/certs/3ec20f2e.0"
	I0731 10:07:10.703456    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0731 10:07:10.712113    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.715734    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 31 16:40 /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.715795    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0731 10:07:10.720285    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0731 10:07:10.728964    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1591.pem && ln -fs /usr/share/ca-certificates/1591.pem /etc/ssl/certs/1591.pem"
	I0731 10:07:10.737483    3827 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.741091    3827 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 31 16:49 /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.741135    3827 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1591.pem
	I0731 10:07:10.745570    3827 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1591.pem /etc/ssl/certs/51391683.0"
	I0731 10:07:10.754084    3827 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0731 10:07:10.757225    3827 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0731 10:07:10.757258    3827 kubeadm.go:934] updating node {m04 192.169.0.8 0 v1.30.3 docker false true} ...
	I0731 10:07:10.757327    3827 kubeadm.go:946] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.3/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-393000-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.169.0.8
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.3 ClusterName:ha-393000 Namespace:default APIServerHAVIP:192.169.0.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0731 10:07:10.757375    3827 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.3
	I0731 10:07:10.764753    3827 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.3: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.3': No such file or directory
	
	Initiating transfer...
	I0731 10:07:10.764797    3827 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.3
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubeadm.sha256
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubectl.sha256
	I0731 10:07:10.772338    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm -> /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 10:07:10.772344    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl -> /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 10:07:10.772322    3827 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.3/bin/linux/amd64/kubelet.sha256
	I0731 10:07:10.772398    3827 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:07:10.772434    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm
	I0731 10:07:10.772437    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl
	I0731 10:07:10.780324    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubectl: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubectl': No such file or directory
	I0731 10:07:10.780354    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubeadm: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubeadm': No such file or directory
	I0731 10:07:10.780356    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubectl --> /var/lib/minikube/binaries/v1.30.3/kubectl (51454104 bytes)
	I0731 10:07:10.780369    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubeadm --> /var/lib/minikube/binaries/v1.30.3/kubeadm (50249880 bytes)
	I0731 10:07:10.799303    3827 vm_assets.go:164] NewFileAsset: /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet -> /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 10:07:10.799462    3827 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet
	I0731 10:07:10.842469    3827 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.3/kubelet: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.3/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.3/kubelet': No such file or directory
	I0731 10:07:10.842511    3827 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/linux/amd64/v1.30.3/kubelet --> /var/lib/minikube/binaries/v1.30.3/kubelet (100125080 bytes)
	I0731 10:07:11.478912    3827 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0731 10:07:11.486880    3827 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (311 bytes)
	I0731 10:07:11.501278    3827 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0731 10:07:11.515550    3827 ssh_runner.go:195] Run: grep 192.169.0.254	control-plane.minikube.internal$ /etc/hosts
	I0731 10:07:11.518663    3827 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.169.0.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0731 10:07:11.528373    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:11.625133    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:07:11.645175    3827 start.go:235] Will wait 6m0s for node &{Name:m04 IP:192.169.0.8 Port:0 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:false Worker:true}
	I0731 10:07:11.645375    3827 config.go:182] Loaded profile config "ha-393000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:07:11.651211    3827 out.go:177] * Verifying Kubernetes components...
	I0731 10:07:11.692705    3827 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0731 10:07:11.797111    3827 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0731 10:07:12.534860    3827 loader.go:395] Config loaded from file:  /Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 10:07:12.535084    3827 kapi.go:59] client config for ha-393000: &rest.Config{Host:"https://192.169.0.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.crt", KeyFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/ha-393000/client.key", CAFile:"/Users/jenkins/minikube-integration/19349-1046/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x52a5660), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0731 10:07:12.535128    3827 kubeadm.go:483] Overriding stale ClientConfig host https://192.169.0.254:8443 with https://192.169.0.5:8443
	I0731 10:07:12.535291    3827 node_ready.go:35] waiting up to 6m0s for node "ha-393000-m04" to be "Ready" ...
	I0731 10:07:12.535335    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:12.535339    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:12.535359    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:12.535366    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:12.537469    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:13.035600    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:13.035613    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:13.035620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:13.035622    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:13.037811    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:13.536601    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:13.536621    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:13.536630    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:13.536636    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:13.539103    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.035926    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:14.035943    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:14.035952    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:14.035957    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:14.038327    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.535691    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:14.535711    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:14.535719    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:14.535723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:14.538107    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:14.538174    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:15.035707    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:15.035726    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:15.035735    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:15.035739    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:15.037991    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:15.535587    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:15.535602    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:15.535658    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:15.535663    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:15.537787    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.035475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:16.035497    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:16.035550    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:16.035555    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:16.037882    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.536666    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:16.536687    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:16.536712    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:16.536719    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:16.538796    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:16.538904    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:17.035473    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:17.035488    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:17.035495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:17.035498    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:17.037610    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:17.535997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:17.536074    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:17.536089    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:17.536096    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:17.539102    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:18.035624    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:18.035638    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:18.035646    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:18.035652    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:18.037956    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:18.535491    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:18.535589    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:18.535603    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:18.535610    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:18.538819    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:18.538965    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:19.036954    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:19.037007    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:19.037028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:19.037033    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:19.039345    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:19.536847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:19.536862    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:19.536870    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:19.536873    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:19.538820    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:20.037064    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:20.037079    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:20.037086    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:20.037089    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:20.038945    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:20.536127    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:20.536138    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:20.536145    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:20.536150    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:20.538039    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:21.036613    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:21.036684    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:21.036695    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:21.036701    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:21.039123    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:21.039186    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:21.536684    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:21.536700    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:21.536705    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:21.536708    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:21.538918    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:22.036722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:22.036736    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:22.036743    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:22.036746    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:22.038627    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:22.536686    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:22.536704    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:22.536714    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:22.536718    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:22.538549    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:23.036470    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:23.036482    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:23.036489    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:23.036494    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:23.038533    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:23.535581    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:23.535639    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:23.535653    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:23.535667    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:23.539678    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:23.539740    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:24.036874    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:24.036948    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:24.036959    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:24.036965    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:24.039843    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:24.536241    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:24.536307    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:24.536318    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:24.536323    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:24.538807    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:25.036279    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:25.036343    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:25.036356    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:25.036362    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:25.038454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:25.535942    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:25.535954    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:25.535962    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:25.535967    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:25.538068    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:26.036823    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:26.036838    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:26.036845    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:26.036848    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:26.038942    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:26.039008    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:26.535480    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:26.535499    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:26.535533    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:26.535539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:26.538039    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:27.036202    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:27.036213    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:27.036219    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:27.036222    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:27.038071    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:27.537206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:27.537226    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:27.537236    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:27.537248    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:27.539573    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:28.036203    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:28.036217    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:28.036223    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:28.036225    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:28.038017    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:28.536971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:28.536988    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:28.536998    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:28.537003    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:28.539378    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:28.539442    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:29.035655    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:29.035667    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:29.035673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:29.035676    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:29.037786    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:29.537109    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:29.537124    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:29.537132    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:29.537144    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:29.539430    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:30.035887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:30.035899    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:30.035905    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:30.035908    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:30.037803    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:30.535679    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:30.535701    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:30.535718    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:30.535723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:30.539029    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:31.036151    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:31.036166    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:31.036175    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:31.036179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:31.038532    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:31.038593    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:31.536698    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:31.536710    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:31.536717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:31.536720    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:31.538484    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:32.037162    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:32.037178    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:32.037185    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:32.037188    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:32.039081    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:32.536065    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:32.536085    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:32.536095    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:32.536099    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:32.538365    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:33.036492    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:33.036513    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:33.036523    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:33.036527    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:33.038851    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:33.038919    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:33.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:33.535566    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:33.535572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:33.535576    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:33.537575    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:34.036894    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:34.036912    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:34.036923    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:34.036932    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:34.040173    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:34.535858    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:34.535912    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:34.535919    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:34.535922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:34.537915    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:35.036636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:35.036670    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:35.036677    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:35.036682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:35.038861    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:35.038930    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:35.535814    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:35.535827    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:35.535835    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:35.535840    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:35.538360    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:36.035769    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:36.035785    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:36.035795    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:36.035799    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:36.038202    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:36.535426    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:36.535438    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:36.535445    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:36.535449    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:36.537303    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:37.035456    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:37.035470    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:37.035479    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:37.035483    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:37.037630    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:37.536548    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:37.536562    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:37.536568    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:37.536572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:37.538659    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:37.538720    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:38.036407    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:38.036421    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:38.036427    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:38.036432    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:38.038467    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:38.537359    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:38.537378    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:38.537387    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:38.537392    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:38.539892    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:39.036414    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:39.036470    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:39.036486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:39.036495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:39.039521    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:39.535817    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:39.535832    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:39.535839    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:39.535843    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:39.537796    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:40.035880    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:40.035896    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:40.035902    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:40.035906    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:40.037712    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:40.037778    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:40.535492    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:40.535523    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:40.535536    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:40.535543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:40.538475    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:41.035745    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:41.035758    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:41.035770    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:41.035774    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:41.037656    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:41.535726    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:41.535738    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:41.535744    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:41.535747    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:41.537897    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:42.036537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:42.036554    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:42.036564    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:42.036573    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:42.039525    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:42.039600    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:42.535450    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:42.535465    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:42.535472    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:42.535475    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:42.537399    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:43.035576    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:43.035592    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:43.035598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:43.035602    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:43.038048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:43.536787    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:43.536822    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:43.536832    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:43.536837    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:43.539146    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:44.036148    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:44.036161    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:44.036169    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:44.036173    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:44.038382    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:44.536653    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:44.536709    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:44.536717    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:44.536720    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:44.538695    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:44.538753    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:45.036650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:45.036662    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:45.036668    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:45.036672    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:45.038555    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:45.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:45.535571    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:45.535582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:45.535590    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:45.538335    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:46.035712    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:46.035726    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:46.035735    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:46.035740    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:46.038035    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:46.535534    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:46.535549    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:46.535557    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:46.535564    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:46.537974    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:47.035871    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:47.035887    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:47.035893    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:47.035897    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:47.037864    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:47.037931    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:47.535553    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:47.535564    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:47.535570    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:47.535573    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:47.537590    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:48.035461    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:48.035531    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:48.035539    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:48.035543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:48.037510    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:48.536520    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:48.536535    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:48.536541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:48.536544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:48.538561    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:49.035436    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:49.035448    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:49.035454    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:49.035458    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:49.037204    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:49.535574    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:49.535586    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:49.535592    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:49.535595    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:49.537443    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:49.537505    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:50.036533    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:50.036547    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:50.036562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:50.036566    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:50.038478    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:50.536624    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:50.536636    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:50.536642    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:50.536646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:50.538734    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.036016    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:51.036035    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:51.036044    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:51.036049    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:51.038643    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.536662    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:51.536677    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:51.536686    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:51.536691    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:51.539033    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:51.539099    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:52.036475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:52.036490    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:52.036499    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:52.036503    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:52.038975    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:52.537013    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:52.537034    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:52.537041    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:52.537045    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:52.539229    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.037093    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:53.037106    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:53.037113    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:53.037117    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:53.039169    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.536447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:53.536468    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:53.536478    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:53.536486    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:53.539425    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:53.539565    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:54.035597    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:54.035609    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:54.035615    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:54.035618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:54.037574    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:54.535484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:54.535503    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:54.535509    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:54.535514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:54.537529    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:55.036258    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:55.036270    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:55.036277    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:55.036280    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:55.038186    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:55.536493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:55.536513    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:55.536526    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:55.536533    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:55.539517    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:55.539589    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:56.035565    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:56.035586    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:56.035599    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:56.035605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:56.040006    3827 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0731 10:07:56.536361    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:56.536374    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:56.536380    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:56.536383    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:56.538540    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:57.036446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:57.036544    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:57.036560    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:57.036567    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:57.039754    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:57.536620    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:57.536630    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:57.536637    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:57.536639    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:57.538482    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:07:58.036499    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:58.036518    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:58.036527    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:58.036532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:58.039244    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:58.039325    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:07:58.537076    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:58.537105    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:58.537197    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:58.537204    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:58.539718    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:07:59.037046    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:59.037127    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:59.037142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:59.037149    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:59.040197    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:07:59.536758    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:07:59.536790    3827 round_trippers.go:469] Request Headers:
	I0731 10:07:59.536798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:07:59.536802    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:07:59.538842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:00.035440    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:00.035453    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:00.035460    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:00.035463    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:00.037506    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:00.536873    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:00.536895    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:00.536906    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:00.536913    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:00.540041    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:00.540123    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:01.036175    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:01.036225    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:01.036239    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:01.036248    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:01.039214    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:01.535960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:01.535973    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:01.535979    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:01.535983    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:01.538089    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:02.036835    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:02.036856    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:02.036868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:02.036875    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:02.039802    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:02.536647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:02.536660    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:02.536667    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:02.536670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:02.538840    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:03.036159    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:03.036174    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:03.036181    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:03.036184    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:03.038276    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:03.038354    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:03.536974    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:03.536990    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:03.536996    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:03.537000    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:03.538828    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:04.036300    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:04.036363    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:04.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:04.036391    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:04.038707    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:04.535718    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:04.535737    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:04.535749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:04.535759    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:04.538366    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:05.036299    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:05.036316    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:05.036350    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:05.036354    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:05.038510    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:05.038568    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:05.535824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:05.535837    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:05.535843    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:05.535846    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:05.537780    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:06.036578    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:06.036592    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:06.036607    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:06.036612    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:06.038642    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:06.535656    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:06.535670    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:06.535679    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:06.535682    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:06.538248    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:07.036322    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:07.036396    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:07.036407    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:07.036412    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:07.038943    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:07.039003    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:07.536357    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:07.536370    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:07.536379    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:07.536384    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:07.538778    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:08.036360    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:08.036375    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:08.036381    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:08.036384    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:08.038393    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:08.536197    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:08.536266    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:08.536278    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:08.536284    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:08.538997    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:09.036883    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:09.036911    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:09.036918    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:09.036922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:09.039071    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:09.039137    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:09.535649    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:09.535664    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:09.535673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:09.535677    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:09.537998    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:10.036205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:10.036229    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:10.036241    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:10.036247    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:10.039273    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:10.536564    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:10.536575    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:10.536582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:10.536585    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:10.538369    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:11.036693    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:11.036710    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:11.036749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:11.036753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:11.038831    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:11.535438    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:11.535452    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:11.535461    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:11.535466    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:11.537490    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:11.537597    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:12.035786    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:12.035805    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:12.035812    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:12.035816    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:12.038145    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:12.536840    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:12.536858    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:12.536868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:12.536881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:12.538815    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.037034    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:13.037049    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:13.037056    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:13.037059    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:13.038933    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.535502    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:13.535519    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:13.535593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:13.535600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:13.537560    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:13.537648    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:14.036280    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:14.036300    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:14.036312    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:14.036322    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:14.039000    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:14.535507    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:14.535527    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:14.535537    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:14.535543    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:14.538228    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:15.036543    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:15.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:15.036634    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:15.036643    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:15.039762    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:15.535993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:15.536006    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:15.536012    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:15.536015    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:15.538186    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:15.538254    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:16.035582    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:16.035595    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:16.035602    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:16.035605    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:16.037656    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:16.536650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:16.536663    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:16.536709    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:16.536713    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:16.538604    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:17.036351    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:17.036372    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:17.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:17.036393    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:17.039451    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:17.536542    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:17.536560    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:17.536573    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:17.536582    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:17.539454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:17.539591    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:18.036512    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:18.036578    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:18.036588    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:18.036593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:18.038886    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:18.535537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:18.535549    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:18.535554    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:18.535559    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:18.537559    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:19.035943    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:19.035968    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:19.035980    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:19.035987    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:19.038665    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:19.536893    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:19.536911    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:19.536920    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:19.536925    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:19.539416    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:20.036463    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:20.036479    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:20.036495    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:20.036500    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:20.038824    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:20.038907    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:20.536286    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:20.536306    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:20.536313    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:20.536316    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:20.538429    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:21.036034    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:21.036045    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:21.036051    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:21.036055    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:21.038101    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:21.535690    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:21.535711    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:21.535732    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:21.535740    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:21.538264    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:22.036592    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:22.036604    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:22.036610    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:22.036613    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:22.038773    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:22.536090    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:22.536103    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:22.536109    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:22.536114    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:22.537988    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:22.538057    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:23.035526    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:23.035555    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:23.035562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:23.035567    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:23.037480    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:23.536652    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:23.536666    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:23.536673    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:23.536677    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:23.538667    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:24.036746    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:24.036766    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:24.036778    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:24.036789    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:24.039353    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:24.536440    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:24.536452    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:24.536459    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:24.536463    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:24.538250    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:24.538315    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:25.036622    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:25.036643    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:25.036656    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:25.036666    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:25.039764    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:25.535710    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:25.535721    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:25.535737    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:25.535742    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:25.537637    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:26.036253    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:26.036276    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:26.036338    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:26.036343    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:26.038674    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:26.536815    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:26.536828    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:26.536834    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:26.536838    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:26.538867    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:26.538932    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:27.035852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:27.035864    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:27.035869    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:27.035872    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:27.038024    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:27.535997    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:27.536016    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:27.536028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:27.536036    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:27.539189    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:28.035934    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:28.036002    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:28.036011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:28.036014    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:28.037996    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:28.535538    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:28.535554    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:28.535561    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:28.535563    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:28.537842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:29.037018    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:29.037032    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:29.037039    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:29.037042    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:29.038983    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:29.039043    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:29.535757    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:29.535769    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:29.535775    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:29.535778    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:29.537697    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:30.036529    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:30.036548    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:30.036557    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:30.036562    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:30.038833    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:30.535560    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:30.535570    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:30.535576    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:30.535579    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:30.537657    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:31.035508    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:31.035520    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:31.035527    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:31.035531    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:31.037575    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:31.536786    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:31.536800    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:31.536806    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:31.536809    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:31.538674    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:31.538731    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:32.035819    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:32.035833    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:32.035842    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:32.035848    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:32.038170    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:32.535455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:32.535471    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:32.535481    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:32.535487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:32.537802    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:33.037123    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:33.037156    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:33.037166    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:33.037171    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:33.039252    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:33.535741    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:33.535754    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:33.535760    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:33.535763    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:33.537979    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:34.035638    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:34.035651    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:34.035658    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:34.035661    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:34.037722    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:34.037778    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:34.535808    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:34.535823    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:34.535831    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:34.535834    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:34.538223    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:35.036584    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:35.036609    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:35.036620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:35.036625    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:35.039788    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:35.535720    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:35.535732    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:35.535738    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:35.535741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:35.537506    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:36.036439    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:36.036484    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:36.036492    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:36.036498    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:36.038534    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:36.038591    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:36.535446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:36.535458    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:36.535465    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:36.535467    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:36.537309    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:37.035737    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:37.035776    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:37.035789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:37.035794    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:37.037928    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:37.535410    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:37.535422    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:37.535430    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:37.535433    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:37.537627    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:38.036658    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:38.036738    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:38.036753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:38.036760    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:38.039378    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:38.039521    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:38.535459    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:38.535474    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:38.535490    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:38.535494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:38.537817    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:39.036931    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:39.036949    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:39.036957    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:39.036962    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:39.039286    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:39.536447    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:39.536472    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:39.536487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:39.536491    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:39.538440    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:40.036354    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:40.036378    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:40.036463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:40.036469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:40.039363    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:40.535847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:40.535866    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:40.535878    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:40.535883    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:40.538740    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:40.538822    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:41.036206    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:41.036221    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:41.036229    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:41.036234    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:41.038292    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:41.535741    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:41.535753    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:41.535759    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:41.535764    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:41.537837    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:42.036537    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:42.036558    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:42.036566    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:42.036570    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:42.039104    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:42.536474    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:42.536484    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:42.536491    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:42.536495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:42.538339    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:43.035887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:43.035913    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:43.035925    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:43.035931    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:43.038963    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:43.039028    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:43.537036    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:43.537050    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:43.537056    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:43.537059    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:43.539282    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:44.035937    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:44.035949    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:44.035954    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:44.035958    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:44.037693    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:44.536399    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:44.536470    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:44.536481    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:44.536485    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:44.538818    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:45.036937    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:45.036952    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:45.036960    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:45.036966    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:45.039363    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:45.039449    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:45.535403    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:45.535415    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:45.535421    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:45.535424    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:45.537208    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:46.037001    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:46.037088    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:46.037104    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:46.037110    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:46.040342    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:46.536255    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:46.536269    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:46.536278    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:46.536284    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:46.538801    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:47.037251    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:47.037286    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:47.037297    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:47.037304    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:47.039048    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:47.537021    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:47.537064    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:47.537071    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:47.537076    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:47.539084    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:47.539154    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:48.037354    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:48.037369    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:48.037376    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:48.037379    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:48.039646    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:48.536219    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:48.536236    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:48.536272    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:48.536276    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:48.538242    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:49.035446    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:49.035459    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:49.035465    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:49.035469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:49.037563    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:49.535517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:49.535533    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:49.535540    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:49.535543    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:49.537433    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:50.036639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:50.036659    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:50.036665    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:50.036670    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:50.038735    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:50.038803    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:50.535659    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:50.535678    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:50.535690    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:50.535697    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:50.538598    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:51.036768    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:51.036782    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:51.036789    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:51.036794    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:51.038898    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:51.536592    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:51.536608    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:51.536616    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:51.536621    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:51.539087    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:52.036618    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:52.036639    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:52.036652    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:52.036658    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:52.039828    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:52.039911    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:52.535902    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:52.535912    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:52.535919    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:52.535922    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:52.537950    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:53.036636    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:53.036705    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:53.036716    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:53.036721    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:53.039002    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:53.535455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:53.535467    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:53.535473    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:53.535476    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:53.537615    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:54.036291    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:54.036325    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:54.036406    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:54.036414    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:54.039211    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:54.535751    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:54.535763    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:54.535769    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:54.535772    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:54.537488    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:54.537606    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:55.036966    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:55.036982    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:55.036988    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:55.036992    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:55.038791    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:55.537260    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:55.537303    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:55.537312    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:55.537315    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:55.539579    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:56.036346    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:56.036359    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:56.036367    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:56.036370    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:56.038527    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:56.536015    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:56.536055    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:56.536063    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:56.536068    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:56.538048    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:56.538106    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:57.036625    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:57.036637    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:57.036643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:57.036646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:57.038481    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:57.536731    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:57.536744    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:57.536749    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:57.536752    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:57.538619    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:58.037081    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:58.037160    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:58.037174    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:58.037182    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:58.040222    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:08:58.535441    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:58.535453    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:58.535460    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:58.535463    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:58.537373    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:08:59.037130    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:59.037151    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:59.037161    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:59.037181    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:59.039237    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:08:59.039342    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:08:59.536756    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:08:59.536768    3827 round_trippers.go:469] Request Headers:
	I0731 10:08:59.536774    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:08:59.536777    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:08:59.538430    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:00.036701    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:00.036714    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:00.036720    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:00.036723    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:00.038842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:00.535558    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:00.535574    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:00.535620    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:00.535625    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:00.537993    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.036274    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:01.036293    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:01.036302    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:01.036305    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:01.038700    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.536455    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:01.536488    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:01.536495    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:01.536511    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:01.538672    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:01.538736    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:02.036272    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:02.036286    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:02.036291    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:02.036295    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:02.038419    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:02.535392    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:02.535405    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:02.535416    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:02.535419    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:02.537336    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:03.036249    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:03.036264    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:03.036271    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:03.036276    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:03.038181    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:03.536990    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:03.537012    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:03.537020    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:03.537024    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:03.541054    3827 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0731 10:09:03.541125    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:04.036809    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:04.036887    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:04.036896    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:04.036902    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:04.039202    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:04.537089    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:04.537152    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:04.537166    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:04.537904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:04.540615    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:05.036817    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:05.036832    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:05.036838    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:05.036842    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:05.038865    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:05.535412    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:05.535430    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:05.535438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:05.535446    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:05.538103    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:06.036140    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:06.036160    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:06.036172    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:06.036179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:06.039025    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:06.039098    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:06.536908    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:06.536923    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:06.536930    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:06.536933    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:06.538854    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:07.035951    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:07.035965    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:07.035974    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:07.035979    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:07.038105    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:07.535618    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:07.535629    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:07.535635    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:07.535637    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:07.537552    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:08.036184    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:08.036212    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:08.036273    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:08.036279    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:08.038850    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:08.536040    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:08.536056    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:08.536065    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:08.536069    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:08.538402    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:08.538460    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:09.036971    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:09.037018    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:09.037025    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:09.037031    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:09.039100    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:09.535468    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:09.535480    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:09.535487    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:09.535490    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:09.537589    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.035464    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:10.035479    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:10.035491    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:10.035506    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:10.037831    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.536550    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:10.536622    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:10.536632    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:10.536638    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:10.539005    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:10.539064    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:11.037316    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:11.037399    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:11.037415    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:11.037425    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:11.040113    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:11.536965    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:11.536989    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:11.537033    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:11.537044    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:11.539689    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:12.036399    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:12.036469    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:12.036480    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:12.036486    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:12.038399    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:12.535441    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:12.535463    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:12.535475    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:12.535486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:12.539207    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:12.539333    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:13.036110    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:13.036220    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:13.036231    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:13.036236    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:13.038510    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:13.535970    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:13.535990    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:13.536002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:13.536008    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:13.539197    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:14.037193    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:14.037263    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:14.037274    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:14.037286    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:14.039603    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:14.535571    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:14.535586    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:14.535591    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:14.535594    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:14.537915    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:15.036611    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:15.036630    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:15.036642    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:15.036648    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:15.039592    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:15.039739    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:15.535565    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:15.535590    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:15.535602    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:15.535608    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:15.539127    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:16.035884    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:16.035904    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:16.035915    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:16.035919    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:16.038938    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:16.535882    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:16.535893    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:16.535900    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:16.535904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:16.537836    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:17.036590    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:17.036605    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:17.036613    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:17.036618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:17.039082    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:17.535436    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:17.535454    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:17.535466    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:17.535472    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:17.539228    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:17.539295    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:18.035478    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:18.035491    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:18.035505    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:18.035509    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:18.037946    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:18.536869    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:18.536884    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:18.536890    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:18.536896    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:18.538941    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:19.035847    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:19.035859    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:19.035865    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:19.035868    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:19.037761    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:19.536117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:19.536142    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:19.536154    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:19.536160    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:19.539347    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:19.539466    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:20.036919    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:20.036993    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:20.037004    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:20.037009    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:20.039230    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:20.536619    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:20.536716    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:20.536731    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:20.536738    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:20.539591    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:21.036024    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:21.036114    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:21.036129    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:21.036136    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:21.038666    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:21.535434    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:21.535447    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:21.535453    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:21.535457    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:21.537251    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:22.037204    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:22.037219    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:22.037228    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:22.037234    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:22.039524    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:22.039581    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:22.536431    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:22.536450    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:22.536464    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:22.536473    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:22.539233    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:23.035562    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:23.035606    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:23.035627    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:23.035634    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:23.037971    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:23.536650    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:23.536675    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:23.536742    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:23.536752    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:23.539879    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:24.035514    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:24.035529    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:24.035535    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:24.035544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:24.037431    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:24.536058    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:24.536156    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:24.536171    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:24.536179    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:24.538730    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:24.538810    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:25.036752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:25.036804    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:25.036814    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:25.036821    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:25.039117    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:25.535569    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:25.535587    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:25.535596    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:25.535600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:25.538114    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:26.035517    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:26.035542    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:26.035556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:26.035562    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:26.038485    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:26.536365    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:26.536379    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:26.536386    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:26.536390    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:26.538690    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:27.036639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:27.036652    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:27.036703    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:27.036709    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:27.038432    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:27.038498    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:27.535539    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:27.535560    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:27.535572    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:27.535580    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:27.538434    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:28.035626    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:28.035638    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:28.035644    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:28.035647    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:28.037699    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:28.536177    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:28.536199    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:28.536212    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:28.536217    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:28.539218    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:29.036925    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:29.036950    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:29.036962    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:29.036969    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:29.040007    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:29.040064    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:29.537194    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:29.537209    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:29.537228    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:29.537240    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:29.539598    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:30.036373    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:30.036471    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:30.036486    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:30.036494    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:30.039302    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:30.536789    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:30.536807    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:30.536815    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:30.536820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:30.539885    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:31.036599    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:31.036624    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:31.036635    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:31.036643    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:31.039815    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:31.536237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:31.536285    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:31.536295    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:31.536301    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:31.538680    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:31.538744    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:32.036451    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:32.036463    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:32.036469    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:32.036472    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:32.038847    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:32.536969    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:32.537019    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:32.537032    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:32.537041    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:32.539636    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:33.035557    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:33.035573    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:33.035582    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:33.035587    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:33.038048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:33.535485    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:33.535509    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:33.535522    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:33.535529    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:33.538268    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:34.035811    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:34.035830    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:34.035841    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:34.035846    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:34.038580    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:34.038645    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:34.535515    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:34.535533    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:34.535543    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:34.535562    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:34.537523    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:35.036865    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:35.036880    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:35.036887    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:35.036890    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:35.038894    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:35.535476    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:35.535566    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:35.535574    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:35.535579    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:35.537495    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:36.036205    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:36.036221    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:36.036227    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:36.036231    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:36.038994    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:36.039061    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:36.536105    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:36.536117    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:36.536124    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:36.536127    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:36.538020    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:37.036112    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:37.036124    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:37.036130    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:37.036134    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:37.037953    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:37.536082    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:37.536101    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:37.536110    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:37.536114    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:37.538459    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:38.035493    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:38.035509    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:38.035517    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:38.035524    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:38.037791    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:38.535613    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:38.535632    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:38.535645    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:38.535668    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:38.539185    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:38.539281    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:39.036660    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:39.036682    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:39.036693    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:39.036700    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:39.039452    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:39.535986    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:39.536000    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:39.536007    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:39.536011    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:39.537968    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:40.036939    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:40.037010    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:40.037021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:40.037026    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:40.039435    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:40.536149    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:40.536171    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:40.536233    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:40.536239    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:40.538338    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:41.036629    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:41.036641    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:41.036647    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:41.036651    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:41.038835    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:41.038897    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:41.536269    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:41.536280    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:41.536287    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:41.536290    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:41.538277    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:42.036495    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:42.036511    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:42.036520    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:42.036524    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:42.038560    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:42.537182    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:42.537201    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:42.537210    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:42.537215    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:42.539833    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.035857    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:43.035874    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:43.035881    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:43.035891    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:43.038530    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.536377    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:43.536465    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:43.536480    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:43.536488    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:43.539159    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:43.539217    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:44.036979    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:44.037065    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:44.037081    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:44.037089    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:44.039312    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:44.536993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:44.537011    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:44.537018    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:44.537063    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:44.539131    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.036929    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:45.036952    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:45.037050    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:45.037064    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:45.039700    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.537089    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:45.537112    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:45.537123    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:45.537132    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:45.539940    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:45.540011    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:46.036811    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:46.036857    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:46.036868    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:46.036882    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:46.039540    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:46.535831    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:46.535845    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:46.535852    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:46.535856    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:46.538387    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:47.036117    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:47.036128    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:47.036134    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:47.036137    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:47.037871    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:47.536504    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:47.536553    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:47.536564    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:47.536568    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:47.538867    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:48.036960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:48.036980    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:48.036992    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:48.036998    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:48.040512    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:48.041066    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:48.535514    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:48.535532    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:48.535542    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:48.535547    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:48.537881    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:49.036112    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:49.036124    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:49.036130    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:49.036133    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:49.038899    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:49.536876    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:49.536893    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:49.536899    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:49.536904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:49.538675    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:50.037190    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:50.037204    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:50.037213    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:50.037216    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:50.039015    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:50.536824    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:50.536920    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:50.536935    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:50.536942    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:50.539735    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:50.539808    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:51.035683    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:51.035696    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:51.035702    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:51.035706    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:51.038883    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:51.536861    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:51.536882    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:51.536894    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:51.536901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:51.539779    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:52.035474    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:52.035485    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:52.035493    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:52.035499    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:52.037401    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:52.536642    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:52.536661    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:52.536669    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:52.536674    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:52.538949    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:53.036427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:53.036471    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:53.036482    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:53.036487    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:53.038951    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:53.039010    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:53.535427    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:53.535439    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:53.535446    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:53.535450    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:53.537257    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:54.036806    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:54.036821    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:54.036828    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:54.036832    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:54.039021    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:54.535805    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:54.535897    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:54.535912    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:54.535919    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:54.538990    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:09:55.036521    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:55.036539    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:55.036546    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:55.036549    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:55.038766    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:55.536647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:55.536714    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:55.536723    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:55.536727    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:55.539055    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:55.539163    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:56.035522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:56.035534    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:56.035541    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:56.035545    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:56.038160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:56.535916    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:56.535934    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:56.535943    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:56.535949    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:56.538329    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:57.036391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:57.036406    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:57.036413    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:57.036417    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:57.038267    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:57.535390    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:57.535439    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:57.535447    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:57.535452    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:57.537243    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:58.036752    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:58.036778    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:58.036805    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:58.036809    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:58.038620    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:58.038682    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:09:58.536471    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:58.536516    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:58.536526    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:58.536532    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:58.538643    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:09:59.035837    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:59.035851    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:59.035858    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:59.035861    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:59.037705    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:09:59.536730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:09:59.536832    3827 round_trippers.go:469] Request Headers:
	I0731 10:09:59.536848    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:09:59.536854    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:09:59.539682    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:00.035558    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:00.035587    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:00.035600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:00.035612    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:00.037523    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:00.535512    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:00.535528    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:00.535534    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:00.535537    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:00.537603    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:00.537667    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:01.036888    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:01.036943    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:01.036951    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:01.036955    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:01.038774    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:01.535488    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:01.535504    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:01.535513    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:01.535517    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:01.538017    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:02.036031    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:02.036045    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:02.036051    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:02.036054    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:02.037488    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:02.537218    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:02.537285    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:02.537295    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:02.537300    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:02.539559    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:02.539701    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:03.036241    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:03.036256    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:03.036263    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:03.036269    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:03.037763    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:03.536877    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:03.536892    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:03.536901    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:03.536904    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:03.539168    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:04.035721    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:04.035733    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:04.035739    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:04.035742    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:04.037607    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:04.535679    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:04.535694    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:04.535703    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:04.535707    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:04.537920    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:05.037180    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:05.037195    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:05.037201    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:05.037205    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:05.038872    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:05.038947    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:05.536233    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:05.536248    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:05.536254    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:05.536258    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:05.538191    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:06.036830    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:06.036845    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:06.036852    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:06.036856    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:06.038427    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:06.536722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:06.536735    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:06.536741    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:06.536753    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:06.538631    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:07.036171    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:07.036186    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:07.036192    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:07.036195    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:07.038330    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:07.536466    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:07.536481    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:07.536488    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:07.536492    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:07.538446    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:07.538510    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:08.036787    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:08.036821    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:08.036832    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:08.036853    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:08.039084    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:08.535567    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:08.535582    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:08.535589    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:08.535593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:08.537711    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.035421    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:09.035432    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:09.035438    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:09.035442    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:09.037921    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.535887    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:09.535904    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:09.535913    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:09.535943    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:09.538516    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:09.538592    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:10.035458    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:10.035469    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:10.035474    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:10.035477    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:10.038652    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:10.535979    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:10.535992    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:10.535998    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:10.536002    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:10.537981    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:11.035819    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:11.035886    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:11.035897    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:11.035901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:11.038043    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:11.535475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:11.535487    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:11.535494    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:11.535497    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:11.537395    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:12.036578    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:12.036591    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:12.036598    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:12.036601    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:12.038621    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:12.038676    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:12.536927    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:12.536941    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:12.536947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:12.536952    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:12.539050    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:13.036386    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:13.036399    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:13.036428    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:13.036433    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:13.038022    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:13.536356    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:13.536376    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:13.536403    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:13.536406    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:13.538305    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:14.035960    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:14.035973    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:14.035979    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:14.035983    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:14.037566    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:14.535889    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:14.535909    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:14.535920    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:14.535926    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:14.538796    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:14.538873    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:15.037263    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:15.037278    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:15.037284    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:15.037291    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:15.038934    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:15.535930    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:15.535949    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:15.535957    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:15.535961    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:15.538412    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:16.035774    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:16.035790    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:16.035798    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:16.035803    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:16.037617    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:16.536338    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:16.536352    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:16.536359    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:16.536362    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:16.538545    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:17.036602    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:17.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:17.036625    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:17.036630    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:17.039042    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:17.039098    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:17.535886    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:17.535901    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:17.535907    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:17.535910    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:17.538060    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:18.036894    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:18.036938    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:18.036947    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:18.036950    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:18.038702    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:18.535556    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:18.535571    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:18.535580    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:18.535586    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:18.537620    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:19.035993    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:19.036009    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:19.036017    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:19.036021    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:19.038160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:19.536410    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:19.536433    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:19.536444    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:19.536452    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:19.539613    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:19.539694    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:20.035430    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:20.035445    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:20.035456    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:20.035466    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:20.037008    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:20.536812    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:20.536836    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:20.536849    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:20.536855    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:20.539846    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:21.035730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:21.035746    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:21.035755    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:21.035761    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:21.037893    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:21.536119    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:21.536158    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:21.536173    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:21.536181    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:21.538305    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:22.035742    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:22.035779    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:22.035790    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:22.035796    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:22.038072    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:22.038175    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:22.536977    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:22.536992    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:22.536999    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:22.537002    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:22.539319    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:23.036522    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:23.036538    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:23.036544    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:23.036547    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:23.038326    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:23.537176    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:23.537194    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:23.537202    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:23.537208    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:23.539537    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:24.036672    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:24.036686    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:24.036692    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:24.036696    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:24.038290    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:24.038347    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:24.536490    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:24.536508    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:24.536519    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:24.536525    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:24.539462    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:25.036309    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:25.036323    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:25.036329    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:25.036332    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:25.038173    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:25.535523    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:25.535539    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:25.535547    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:25.535552    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:25.538454    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:26.035663    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:26.035681    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:26.035719    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:26.035722    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:26.037593    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:26.536821    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:26.536884    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:26.536893    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:26.536896    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:26.538841    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:26.538912    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:27.036722    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:27.036734    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:27.036740    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:27.036743    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:27.038648    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:27.537059    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:27.537079    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:27.537111    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:27.537116    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:27.539595    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:28.035398    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:28.035411    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:28.035417    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:28.035421    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:28.037116    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:28.536047    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:28.536115    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:28.536125    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:28.536133    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:28.538589    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:29.036033    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:29.036048    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:29.036055    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:29.036058    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:29.038794    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:29.038860    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:29.536173    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:29.536187    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:29.536193    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:29.536198    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:29.538161    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:30.036950    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:30.037050    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:30.037065    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:30.037072    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:30.039996    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:30.536407    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:30.536424    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:30.536485    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:30.536492    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:30.538637    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:31.036484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:31.036581    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:31.036593    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:31.036600    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:31.039439    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:31.039521    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:31.535848    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:31.535863    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:31.535872    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:31.535878    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:31.538048    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:32.036070    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:32.036083    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:32.036092    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:32.036097    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:32.038358    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:32.535559    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:32.535583    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:32.535597    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:32.535604    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:32.538962    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:33.035868    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:33.035880    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:33.035887    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:33.035890    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:33.038234    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:33.536345    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:33.536363    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:33.536408    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:33.536413    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:33.538408    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:33.538470    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:34.035876    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:34.035899    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:34.035911    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:34.035917    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:34.038813    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:34.535532    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:34.535555    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:34.535599    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:34.535611    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:34.538619    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.036525    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:35.036545    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:35.036557    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:35.036565    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:35.039453    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.536317    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:35.536338    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:35.536346    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:35.536351    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:35.538546    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:35.538604    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:36.035614    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:36.035632    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:36.035642    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:36.035648    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:36.037951    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:36.535593    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:36.535610    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:36.535620    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:36.535627    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:36.538091    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:37.035952    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:37.035972    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:37.035984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:37.035992    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:37.039078    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:37.536397    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:37.536416    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:37.536425    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:37.536431    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:37.538652    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:37.538721    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:38.036647    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:38.036688    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:38.036697    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:38.036702    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:38.038657    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:38.535391    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:38.535458    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:38.535469    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:38.535474    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:38.537747    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:39.036877    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:39.036896    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:39.036908    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:39.036916    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:39.039937    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:39.537361    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:39.537463    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:39.537475    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:39.537480    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:39.540492    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:39.540575    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:40.035736    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:40.035759    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:40.035797    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:40.035817    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:40.038896    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:40.536124    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:40.536136    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:40.536142    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:40.536147    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:40.538082    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:41.036456    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:41.036502    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:41.036513    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:41.036519    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:41.038631    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:41.535516    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:41.535529    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:41.535535    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:41.535539    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:41.537637    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:42.035758    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:42.035779    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:42.035790    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:42.035795    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:42.038565    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:42.038648    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:42.536775    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:42.536801    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:42.536856    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:42.536867    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:42.539883    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:43.036733    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:43.036747    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:43.036754    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:43.036758    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:43.038792    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:43.536704    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:43.536719    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:43.536725    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:43.536730    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:43.538830    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:44.037317    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:44.037342    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:44.037351    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:44.037356    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:44.040355    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:44.040430    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:44.537337    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:44.537352    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:44.537358    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:44.537362    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:44.539426    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:45.036153    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:45.036174    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:45.036187    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:45.036193    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:45.039178    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:45.535572    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:45.535584    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:45.535591    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:45.535596    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:45.537420    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:46.037146    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:46.037161    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:46.037168    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:46.037199    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:46.039539    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:46.536761    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:46.536842    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:46.536857    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:46.536863    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:46.539600    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:46.539683    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:47.037209    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:47.037228    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:47.037237    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:47.037243    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:47.039381    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:47.536097    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:47.536127    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:47.536138    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:47.536143    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:47.540045    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:48.035580    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:48.035598    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:48.035610    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:48.035618    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:48.037609    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:48.535945    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:48.535960    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:48.535966    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:48.535969    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:48.537852    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:49.036904    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:49.036928    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:49.036941    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:49.036946    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:49.039794    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:49.039868    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:49.536635    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:49.536649    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:49.536699    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:49.536704    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:49.538637    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:50.035478    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:50.035491    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:50.035497    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:50.035500    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:50.037398    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:50.536222    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:50.536321    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:50.536335    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:50.536342    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:50.539228    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.035730    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:51.035748    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:51.035813    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:51.035820    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:51.037953    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.536457    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:51.536471    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:51.536480    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:51.536485    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:51.538865    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:51.538935    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:52.036481    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:52.036503    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:52.036583    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:52.036593    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:52.039545    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:52.536583    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:52.536620    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:52.536636    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:52.536646    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:52.539115    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:53.037214    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:53.037226    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:53.037256    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:53.037262    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:53.039257    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:53.535880    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:53.535892    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:53.535898    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:53.535901    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:53.538097    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:54.035680    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:54.035691    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:54.035697    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:54.035702    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:54.037758    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:54.037819    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:54.536181    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:54.536195    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:54.536250    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:54.536256    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:54.538069    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:55.036750    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:55.036858    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:55.036874    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:55.036881    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:55.040140    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:10:55.535731    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:55.535746    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:55.535752    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:55.535755    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:55.537710    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:56.037367    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:56.037382    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:56.037392    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:56.037396    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:56.039716    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:56.039828    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:56.535738    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:56.535750    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:56.535757    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:56.535760    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:56.537553    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:10:57.036797    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:57.036852    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:57.036859    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:57.036862    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:57.038921    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:57.535419    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:57.535437    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:57.535446    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:57.535452    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:57.537842    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.035459    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:58.035475    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:58.035484    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:58.035488    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:58.037963    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.536607    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:58.536625    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:58.536640    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:58.536653    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:58.539173    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:58.539233    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:10:59.035868    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:59.035890    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:59.035902    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:59.035912    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:59.038872    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:10:59.535411    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:10:59.535426    3827 round_trippers.go:469] Request Headers:
	I0731 10:10:59.535432    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:10:59.535434    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:10:59.537913    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:00.036663    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:00.036679    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:00.036686    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:00.036690    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:00.038915    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:00.536586    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:00.536602    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:00.536610    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:00.536615    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:00.538823    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:01.037017    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:01.037041    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:01.037053    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:01.037058    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:01.039885    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:01.039956    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:01.537010    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:01.537022    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:01.537028    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:01.537032    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:01.538870    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:02.036801    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:02.036819    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:02.036827    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:02.036831    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:02.039277    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:02.535479    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:02.535495    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:02.535501    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:02.535505    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:02.537168    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:03.037023    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:03.037069    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:03.037079    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:03.037084    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:03.039521    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:03.536060    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:03.536073    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:03.536079    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:03.536083    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:03.538949    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:03.539021    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:04.036364    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:04.036379    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:04.036385    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:04.036390    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:04.038419    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:04.536237    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:04.536251    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:04.536260    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:04.536264    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:04.538409    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:05.035688    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:05.035701    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:05.035708    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:05.035712    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:05.037474    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:05.535639    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:05.535661    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:05.535671    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:05.535676    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:05.538235    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:06.036540    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:06.036554    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:06.036560    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:06.036564    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:06.039139    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:06.039201    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:06.536852    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:06.536867    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:06.536875    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:06.536879    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:06.539160    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:07.037400    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:07.037412    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:07.037419    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:07.037422    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:07.039316    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:07.535475    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:07.535496    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:07.535507    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:07.535514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:07.538665    3827 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0731 10:11:08.035588    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:08.035602    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:08.035609    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:08.035614    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:08.037450    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:08.535606    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:08.535617    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:08.535624    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:08.535628    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:08.537643    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:08.537700    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:09.036533    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:09.036549    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:09.036556    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:09.036560    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:09.038511    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:09.536726    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:09.536794    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:09.536805    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:09.536810    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:09.539347    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.036599    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:10.036617    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:10.036626    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:10.036630    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:10.038891    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.535919    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:10.535991    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:10.536003    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:10.536009    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:10.538198    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:10.538256    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:11.035775    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:11.035789    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:11.035795    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:11.035799    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:11.037602    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:11.535963    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:11.535977    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:11.535984    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:11.535988    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:11.538020    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:12.035422    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:12.035494    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:12.035509    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:12.035514    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:12.037902    3827 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0731 10:11:12.536484    3827 round_trippers.go:463] GET https://192.169.0.5:8443/api/v1/nodes/ha-393000-m04
	I0731 10:11:12.536500    3827 round_trippers.go:469] Request Headers:
	I0731 10:11:12.536506    3827 round_trippers.go:473]     Accept: application/json, */*
	I0731 10:11:12.536510    3827 round_trippers.go:473]     User-Agent: minikube-darwin-amd64/v0.0.0 (darwin/amd64) kubernetes/$Format
	I0731 10:11:12.538333    3827 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0731 10:11:12.538392    3827 node_ready.go:53] error getting node "ha-393000-m04": nodes "ha-393000-m04" not found
	I0731 10:11:12.538407    3827 node_ready.go:38] duration metric: took 4m0.003142979s for node "ha-393000-m04" to be "Ready" ...
	I0731 10:11:12.560167    3827 out.go:177] 
	W0731 10:11:12.580908    3827 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0731 10:11:12.580926    3827 out.go:239] * 
	W0731 10:11:12.582125    3827 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0731 10:11:12.680641    3827 out.go:177] 
	
	
	==> Docker <==
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.914226423Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928630776Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928700349Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928854780Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.929029367Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.930900389Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.930985608Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.931085246Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.931220258Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.928429805Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.933866106Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.933878374Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.934267390Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953115079Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953269656Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953688559Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:07 ha-393000 dockerd[1180]: time="2024-07-31T17:06:07.953968281Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:38 ha-393000 dockerd[1174]: time="2024-07-31T17:06:38.259320248Z" level=info msg="ignoring event" container=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259503796Z" level=info msg="shim disconnected" id=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 namespace=moby
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259817357Z" level=warning msg="cleaning up after shim disconnected" id=c2de84de71d0d78a5c297972699a04cfcf1df74181e62f75c55ced16c7edf381 namespace=moby
	Jul 31 17:06:38 ha-393000 dockerd[1180]: time="2024-07-31T17:06:38.259827803Z" level=info msg="cleaning up dead shim" namespace=moby
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937784723Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937892479Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.937935988Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 31 17:06:50 ha-393000 dockerd[1180]: time="2024-07-31T17:06:50.938076078Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	
	
	==> container status <==
	CONTAINER           IMAGE                                                                                                 CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	21ff27483d07f       6e38f40d628db                                                                                         5 minutes ago       Running             storage-provisioner       2                   31c959cec2158       storage-provisioner
	7500c837dfe73       8c811b4aec35f                                                                                         6 minutes ago       Running             busybox                   1                   f5579bdb56284       busybox-fc5497c4f-b94zr
	492e11c732d18       cbb01a7bd410d                                                                                         6 minutes ago       Running             coredns                   1                   22a2f7cb99560       coredns-7db6d8ff4d-wvqjl
	26d835568c733       cbb01a7bd410d                                                                                         6 minutes ago       Running             coredns                   1                   8336e3fbaa274       coredns-7db6d8ff4d-5m8st
	193af4895baf9       6f1d07c71fa0f                                                                                         6 minutes ago       Running             kindnet-cni               1                   304fa6a12c82b       kindnet-hjm7c
	4f56054bbee16       55bb025d2cfa5                                                                                         6 minutes ago       Running             kube-proxy                1                   7e638ed37b5ca       kube-proxy-zc52f
	c2de84de71d0d       6e38f40d628db                                                                                         6 minutes ago       Exited              storage-provisioner       1                   31c959cec2158       storage-provisioner
	42b34888f43b4       76932a3b37d7e                                                                                         6 minutes ago       Running             kube-controller-manager   6                   dd7a38b9a9134       kube-controller-manager-ha-393000
	bf0af6a864492       38af8ddebf499                                                                                         7 minutes ago       Running             kube-vip                  1                   7ae512ce66d9e       kube-vip-ha-393000
	0a6a6d756b8d8       76932a3b37d7e                                                                                         7 minutes ago       Exited              kube-controller-manager   5                   dd7a38b9a9134       kube-controller-manager-ha-393000
	a34d35a3b612b       3edc18e7b7672                                                                                         7 minutes ago       Running             kube-scheduler            2                   b550834f339ce       kube-scheduler-ha-393000
	488f4fddc126e       3861cfcd7c04c                                                                                         7 minutes ago       Running             etcd                      2                   35bc88d55a5f9       etcd-ha-393000
	7e0d32286913b       1f6d574d502f3                                                                                         7 minutes ago       Running             kube-apiserver            5                   913ebe1d27d36       kube-apiserver-ha-393000
	aec44315311a1       1f6d574d502f3                                                                                         9 minutes ago       Exited              kube-apiserver            4                   194073f1c5ac9       kube-apiserver-ha-393000
	86018b08bbaa1       3861cfcd7c04c                                                                                         11 minutes ago      Exited              etcd                      1                   ba75e4f4299bf       etcd-ha-393000
	5fcb6f7d8ab78       38af8ddebf499                                                                                         11 minutes ago      Exited              kube-vip                  0                   e6198932cc027       kube-vip-ha-393000
	d088fefe5f8e3       3edc18e7b7672                                                                                         11 minutes ago      Exited              kube-scheduler            1                   f04a7ecd568d2       kube-scheduler-ha-393000
	dfd6c1abd9d81       gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12   15 minutes ago      Exited              busybox                   0                   e9ce137a2245c       busybox-fc5497c4f-b94zr
	4c466ed60807a       cbb01a7bd410d                                                                                         18 minutes ago      Exited              coredns                   0                   480020f5f9c0c       coredns-7db6d8ff4d-5m8st
	feda36fb8a038       cbb01a7bd410d                                                                                         18 minutes ago      Exited              coredns                   0                   c2a288a20831d       coredns-7db6d8ff4d-wvqjl
	8a97800bb6652       kindest/kindnetd@sha256:da8ad203ec15a72c313015e5609db44bfad7c95d8ce63e87ff97c66363b5680a              18 minutes ago      Exited              kindnet-cni               0                   5a6298a01ce2c       kindnet-hjm7c
	d9a1703e5ccc7       55bb025d2cfa5                                                                                         18 minutes ago      Exited              kube-proxy                0                   4811de53b4dfe       kube-proxy-zc52f
	
	
	==> coredns [26d835568c73] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45868 - 37816 "HINFO IN 2903702352377705943.3393804209116430399. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.009308312s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[336879232]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.192) (total time: 30001ms):
	Trace[336879232]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.193)
	Trace[336879232]: [30.001669762s] [30.001669762s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[792684680]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.192) (total time: 30002ms):
	Trace[792684680]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.193)
	Trace[792684680]: [30.002844954s] [30.002844954s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[252017809]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.190) (total time: 30004ms):
	Trace[252017809]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.192)
	Trace[252017809]: [30.004125023s] [30.004125023s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [492e11c732d1] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 257e111468ef6f1e36f10df061303186c353cd0e51aed8f50f4e4fd21cec02687aef97084fe1f82262f5cee88179d311670a6ae21ae185759728216fc264125f
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:50203 - 38178 "HINFO IN 6515882504773672893.3508195612419770899. udp 57 false 512" NXDOMAIN qr,rd,ra 132 0.008964582s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1731745039]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.202) (total time: 30000ms):
	Trace[1731745039]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.203)
	Trace[1731745039]: [30.000463s] [30.000463s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1820975691]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.203) (total time: 30000ms):
	Trace[1820975691]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:06:38.203)
	Trace[1820975691]: [30.00019609s] [30.00019609s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[58591392]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (31-Jul-2024 17:06:08.202) (total time: 30001ms):
	Trace[58591392]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:06:38.203)
	Trace[58591392]: [30.001286385s] [30.001286385s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [4c466ed60807] <==
	[INFO] 10.244.2.2:45681 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000099736s
	[INFO] 10.244.2.2:34074 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000671139s
	[INFO] 10.244.2.2:40529 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000055047s
	[INFO] 10.244.2.2:39487 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000049101s
	[INFO] 10.244.1.2:51329 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000089494s
	[INFO] 10.244.1.2:53070 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000059551s
	[INFO] 10.244.1.2:59895 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000053831s
	[INFO] 10.244.1.2:59390 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000055273s
	[INFO] 10.244.1.2:33340 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,aa,rd,ra 111 0.000051484s
	[INFO] 10.244.1.2:40993 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00011545s
	[INFO] 10.244.0.4:53141 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000061683s
	[INFO] 10.244.0.4:52919 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000069047s
	[INFO] 10.244.2.2:35365 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00006979s
	[INFO] 10.244.2.2:36078 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000091578s
	[INFO] 10.244.1.2:45671 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000102008s
	[INFO] 10.244.1.2:35347 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000068683s
	[INFO] 10.244.0.4:57888 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000102469s
	[INFO] 10.244.0.4:57504 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000906001s
	[INFO] 10.244.0.4:33428 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.00008626s
	[INFO] 10.244.2.2:58922 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000083751s
	[INFO] 10.244.2.2:35072 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000099024s
	[INFO] 10.244.1.2:55592 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109744s
	[INFO] 10.244.1.2:56533 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000078819s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> coredns [feda36fb8a03] <==
	[INFO] 10.244.2.2:46068 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000274895s
	[INFO] 10.244.2.2:47958 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.00061792s
	[INFO] 10.244.2.2:45413 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 140 0.000336332s
	[INFO] 10.244.1.2:34472 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 140 0.000499864s
	[INFO] 10.244.1.2:33765 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,aa,rd,ra 140 0.000055847s
	[INFO] 10.244.0.4:55503 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000107517s
	[INFO] 10.244.0.4:60397 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 111 0.000957382s
	[INFO] 10.244.0.4:39036 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000103899s
	[INFO] 10.244.2.2:50310 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076585s
	[INFO] 10.244.2.2:53000 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.00004745s
	[INFO] 10.244.1.2:32961 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000130818s
	[INFO] 10.244.1.2:54647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000131808s
	[INFO] 10.244.0.4:59711 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000037764s
	[INFO] 10.244.0.4:58639 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000055716s
	[INFO] 10.244.2.2:60323 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000109757s
	[INFO] 10.244.2.2:33304 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000063443s
	[INFO] 10.244.1.2:33908 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000119716s
	[INFO] 10.244.1.2:55347 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000098304s
	[INFO] 10.244.0.4:37343 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000059699s
	[INFO] 10.244.2.2:36477 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000064775s
	[INFO] 10.244.2.2:59513 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000111351s
	[INFO] 10.244.1.2:57676 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000095936s
	[INFO] 10.244.1.2:55460 - 5 "PTR IN 1.0.169.192.in-addr.arpa. udp 42 false 512" NOERROR qr,aa,rd 102 0.000101902s
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	
	==> describe nodes <==
	Name:               ha-393000
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_31T09_53_53_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:53:48 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:12:38 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 16:53:47 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:11:07 +0000   Wed, 31 Jul 2024 17:06:00 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.5
	  Hostname:    ha-393000
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 9e10f5eb61854acbaf6547934383ee12
	  System UUID:                2cfe48dd-0000-0000-9b98-537ad9823a95
	  Boot ID:                    b9343713-c701-4963-b11c-cdefca0b39ab
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-b94zr              0 (0%)        0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 coredns-7db6d8ff4d-5m8st             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     18m
	  kube-system                 coredns-7db6d8ff4d-wvqjl             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     18m
	  kube-system                 etcd-ha-393000                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         18m
	  kube-system                 kindnet-hjm7c                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      18m
	  kube-system                 kube-apiserver-ha-393000             250m (12%)    0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-controller-manager-ha-393000    200m (10%)    0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-proxy-zc52f                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-scheduler-ha-393000             100m (5%)     0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kube-vip-ha-393000                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m40s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 6m38s                  kube-proxy       
	  Normal  Starting                 18m                    kube-proxy       
	  Normal  NodeHasSufficientPID     18m                    kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  Starting                 18m                    kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  18m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  18m                    kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    18m                    kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  RegisteredNode           18m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  NodeReady                18m                    kubelet          Node ha-393000 status is now: NodeReady
	  Normal  RegisteredNode           17m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           16m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           13m                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  Starting                 7m24s                  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  7m24s (x8 over 7m24s)  kubelet          Node ha-393000 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    7m24s (x8 over 7m24s)  kubelet          Node ha-393000 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     7m24s (x7 over 7m24s)  kubelet          Node ha-393000 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  7m24s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           6m42s                  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           6m34s                  node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           6m5s                   node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	  Normal  RegisteredNode           20s                    node-controller  Node ha-393000 event: Registered Node ha-393000 in Controller
	
	
	Name:               ha-393000-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_55_08_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:55:06 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m02
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:12:37 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:10:55 +0000   Wed, 31 Jul 2024 16:55:27 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.6
	  Hostname:    ha-393000-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 83c6a2bd65fe41eb8d2ed449f1d84121
	  System UUID:                7863443c-0000-0000-8e8d-bbd47bc06547
	  Boot ID:                    aad47d4e-f7f0-4bd8-87b6-edfb69496407
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-zln22                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 etcd-ha-393000-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         17m
	  kube-system                 kindnet-lcwbs                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      17m
	  kube-system                 kube-apiserver-ha-393000-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-controller-manager-ha-393000-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-proxy-cf577                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-scheduler-ha-393000-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-vip-ha-393000-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         17m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                  From             Message
	  ----     ------                   ----                 ----             -------
	  Normal   Starting                 6m55s                kube-proxy       
	  Normal   Starting                 13m                  kube-proxy       
	  Normal   Starting                 17m                  kube-proxy       
	  Normal   NodeHasSufficientPID     17m (x7 over 17m)    kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  17m                  kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  17m (x8 over 17m)    kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    17m (x8 over 17m)    kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   RegisteredNode           17m                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           17m                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           16m                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Warning  Rebooted                 13m                  kubelet          Node ha-393000-m02 has been rebooted, boot id: febe9487-cc37-4f76-a943-4c3bd5898a28
	  Normal   NodeHasSufficientPID     13m                  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeHasNoDiskPressure    13m                  kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   Starting                 13m                  kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  13m                  kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  13m                  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   RegisteredNode           13m                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   Starting                 7m5s                 kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  7m5s (x8 over 7m5s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    7m5s (x8 over 7m5s)  kubelet          Node ha-393000-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     7m5s (x7 over 7m5s)  kubelet          Node ha-393000-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  7m5s                 kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           6m42s                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           6m34s                node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           6m5s                 node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	  Normal   RegisteredNode           20s                  node-controller  Node ha-393000-m02 event: Registered Node ha-393000-m02 in Controller
	
	
	Name:               ha-393000-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T09_56_20_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 16:56:18 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:12:42 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:11:31 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:11:31 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:11:31 +0000   Wed, 31 Jul 2024 16:56:18 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:11:31 +0000   Wed, 31 Jul 2024 16:56:40 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.7
	  Hostname:    ha-393000-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 6bd67d455470412d948a97ba6f8b8a9a
	  System UUID:                451d42a6-0000-0000-8ccb-b8851dda0594
	  Boot ID:                    0d534f8f-f62b-4786-808f-39cb1c1bf961
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-n8d7h                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 etcd-ha-393000-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         16m
	  kube-system                 kindnet-s2pv6                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      16m
	  kube-system                 kube-apiserver-ha-393000-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-controller-manager-ha-393000-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-proxy-cr9pg                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-scheduler-ha-393000-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-vip-ha-393000-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 6m17s              kube-proxy       
	  Normal   Starting                 16m                kube-proxy       
	  Normal   NodeAllocatableEnforced  16m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  16m (x8 over 16m)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    16m (x8 over 16m)  kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     16m (x7 over 16m)  kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           16m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           16m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           16m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           13m                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           6m42s              node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           6m34s              node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   Starting                 6m21s              kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  6m21s              kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  6m21s              kubelet          Node ha-393000-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    6m21s              kubelet          Node ha-393000-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     6m21s              kubelet          Node ha-393000-m03 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 6m21s              kubelet          Node ha-393000-m03 has been rebooted, boot id: 0d534f8f-f62b-4786-808f-39cb1c1bf961
	  Normal   RegisteredNode           6m5s               node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	  Normal   RegisteredNode           20s                node-controller  Node ha-393000-m03 event: Registered Node ha-393000-m03 in Controller
	
	
	Name:               ha-393000-m05
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-393000-m05
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=1d737dad7efa60c56d30434fcd857dd3b14c91d9
	                    minikube.k8s.io/name=ha-393000
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_31T10_12_12_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 31 Jul 2024 17:12:09 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-393000-m05
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 31 Jul 2024 17:12:40 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 31 Jul 2024 17:12:39 +0000   Wed, 31 Jul 2024 17:12:09 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 31 Jul 2024 17:12:39 +0000   Wed, 31 Jul 2024 17:12:09 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 31 Jul 2024 17:12:39 +0000   Wed, 31 Jul 2024 17:12:09 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 31 Jul 2024 17:12:39 +0000   Wed, 31 Jul 2024 17:12:30 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.169.0.9
	  Hostname:    ha-393000-m05
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164336Ki
	  pods:               110
	System Info:
	  Machine ID:                 bc282f6d719b4a90b740772af576c327
	  System UUID:                9f1942b2-0000-0000-9bc9-ed8a9a6bda42
	  Boot ID:                    eeb1a68f-a657-4b5c-998f-777ed2c95fa7
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://27.1.1
	  Kubelet Version:            v1.30.3
	  Kube-Proxy Version:         v1.30.3
	PodCIDR:                      10.244.3.0/24
	PodCIDRs:                     10.244.3.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  kube-system                 etcd-ha-393000-m05                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         35s
	  kube-system                 kindnet-2vcxs                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      37s
	  kube-system                 kube-apiserver-ha-393000-m05             250m (12%)    0 (0%)      0 (0%)           0 (0%)         36s
	  kube-system                 kube-controller-manager-ha-393000-m05    200m (10%)    0 (0%)      0 (0%)           0 (0%)         36s
	  kube-system                 kube-proxy-8vlbk                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         34s
	  kube-system                 kube-scheduler-ha-393000-m05             100m (5%)     0 (0%)      0 (0%)           0 (0%)         36s
	  kube-system                 kube-vip-ha-393000-m05                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         33s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 32s                kube-proxy       
	  Normal  NodeHasSufficientMemory  37s (x8 over 37s)  kubelet          Node ha-393000-m05 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    37s (x8 over 37s)  kubelet          Node ha-393000-m05 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     37s (x7 over 37s)  kubelet          Node ha-393000-m05 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  37s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           35s                node-controller  Node ha-393000-m05 event: Registered Node ha-393000-m05 in Controller
	  Normal  RegisteredNode           34s                node-controller  Node ha-393000-m05 event: Registered Node ha-393000-m05 in Controller
	  Normal  RegisteredNode           32s                node-controller  Node ha-393000-m05 event: Registered Node ha-393000-m05 in Controller
	  Normal  RegisteredNode           20s                node-controller  Node ha-393000-m05 event: Registered Node ha-393000-m05 in Controller
	
	
	==> dmesg <==
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.035849] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20200925/tbprint-173)
	[  +0.008140] RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
	[  +5.683009] ACPI Error: Could not enable RealTimeClock event (20200925/evxfevnt-182)
	[  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.007123] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.689234] systemd-fstab-generator[127]: Ignoring "noauto" option for root device
	[  +2.257015] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000008] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +2.569890] systemd-fstab-generator[473]: Ignoring "noauto" option for root device
	[  +0.101117] systemd-fstab-generator[485]: Ignoring "noauto" option for root device
	[  +1.260537] kauditd_printk_skb: 42 callbacks suppressed
	[  +0.721842] systemd-fstab-generator[1103]: Ignoring "noauto" option for root device
	[  +0.244917] systemd-fstab-generator[1140]: Ignoring "noauto" option for root device
	[  +0.105223] systemd-fstab-generator[1152]: Ignoring "noauto" option for root device
	[  +0.108861] systemd-fstab-generator[1166]: Ignoring "noauto" option for root device
	[  +2.483787] systemd-fstab-generator[1382]: Ignoring "noauto" option for root device
	[  +0.096628] systemd-fstab-generator[1394]: Ignoring "noauto" option for root device
	[  +0.110449] systemd-fstab-generator[1406]: Ignoring "noauto" option for root device
	[  +0.128159] systemd-fstab-generator[1422]: Ignoring "noauto" option for root device
	[  +0.446597] systemd-fstab-generator[1585]: Ignoring "noauto" option for root device
	[  +6.854766] kauditd_printk_skb: 271 callbacks suppressed
	[ +21.847998] kauditd_printk_skb: 40 callbacks suppressed
	[Jul31 17:06] kauditd_printk_skb: 80 callbacks suppressed
	
	
	==> etcd [488f4fddc126] <==
	{"level":"info","ts":"2024-07-31T17:06:27.348297Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"cc1c22e219d8e152"}
	{"level":"info","ts":"2024-07-31T17:12:09.48261Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(2035864250365333051 13314548521573537860 14707668837576794450) learners=(3787017823365283298)"}
	{"level":"info","ts":"2024-07-31T17:12:09.48294Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844","added-peer-id":"348e30a557c2bde2","added-peer-peer-urls":["https://192.169.0.9:2380"]}
	{"level":"info","ts":"2024-07-31T17:12:09.482988Z","caller":"rafthttp/peer.go:133","msg":"starting remote peer","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.483013Z","caller":"rafthttp/pipeline.go:72","msg":"started HTTP pipelining with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.484207Z","caller":"rafthttp/peer.go:137","msg":"started remote peer","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.484274Z","caller":"rafthttp/transport.go:317","msg":"added remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2","remote-peer-urls":["https://192.169.0.9:2380"]}
	{"level":"info","ts":"2024-07-31T17:12:09.484339Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.484379Z","caller":"rafthttp/stream.go:169","msg":"started stream writer with remote peer","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.484392Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:09.48461Z","caller":"rafthttp/stream.go:395","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"warn","ts":"2024-07-31T17:12:09.537095Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"348e30a557c2bde2","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"warn","ts":"2024-07-31T17:12:10.027125Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"348e30a557c2bde2","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"warn","ts":"2024-07-31T17:12:10.535309Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"348e30a557c2bde2","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-07-31T17:12:10.924451Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:10.92592Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"348e30a557c2bde2","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-07-31T17:12:10.925957Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:10.926147Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:10.927284Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"b8c6c7563d17d844","to":"348e30a557c2bde2","stream-type":"stream Message"}
	{"level":"info","ts":"2024-07-31T17:12:10.92746Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"info","ts":"2024-07-31T17:12:10.963168Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"b8c6c7563d17d844","remote-peer-id":"348e30a557c2bde2"}
	{"level":"warn","ts":"2024-07-31T17:12:11.526308Z","caller":"etcdhttp/peer.go:150","msg":"failed to promote a member","member-id":"348e30a557c2bde2","error":"etcdserver: can only promote a learner member which is in sync with leader"}
	{"level":"info","ts":"2024-07-31T17:12:12.027619Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 switched to configuration voters=(2035864250365333051 3787017823365283298 13314548521573537860 14707668837576794450)"}
	{"level":"info","ts":"2024-07-31T17:12:12.028082Z","caller":"membership/cluster.go:535","msg":"promote member","cluster-id":"b73189effde9bc63","local-member-id":"b8c6c7563d17d844"}
	{"level":"info","ts":"2024-07-31T17:12:12.028265Z","caller":"etcdserver/server.go:1946","msg":"applied a configuration change through raft","local-member-id":"b8c6c7563d17d844","raft-conf-change":"ConfChangeAddNode","raft-conf-change-node-id":"348e30a557c2bde2"}
	
	
	==> etcd [86018b08bbaa] <==
	{"level":"info","ts":"2024-07-31T17:04:54.706821Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:54.70684Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:54.70685Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:04:55.363421Z","caller":"etcdserver/server.go:2089","msg":"failed to publish local member to cluster through raft","local-member-id":"b8c6c7563d17d844","local-member-attributes":"{Name:ha-393000 ClientURLs:[https://192.169.0.5:2379]}","request-path":"/0/members/b8c6c7563d17d844/attributes","publish-timeout":"7s","error":"etcdserver: request timed out"}
	{"level":"warn","ts":"2024-07-31T17:04:55.539618Z","caller":"etcdhttp/health.go:232","msg":"serving /health false; no leader"}
	{"level":"warn","ts":"2024-07-31T17:04:55.539664Z","caller":"etcdhttp/health.go:119","msg":"/health error","output":"{\"health\":\"false\",\"reason\":\"RAFT NO LEADER\"}","status-code":503}
	{"level":"info","ts":"2024-07-31T17:04:56.510556Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.510829Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.511027Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.51112Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:56.511212Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306509Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306743Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.306923Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.307075Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:04:58.307212Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404702Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404767Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"warn","ts":"2024-07-31T17:04:59.404769Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"1c40d7bfcdf14e3b","rtt":"0s","error":"dial tcp 192.169.0.6:2380: connect: connection refused"}
	{"level":"warn","ts":"2024-07-31T17:04:59.405991Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"cc1c22e219d8e152","rtt":"0s","error":"dial tcp 192.169.0.7:2380: connect: no route to host"}
	{"level":"info","ts":"2024-07-31T17:05:00.106932Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106958Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106967Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 received MsgPreVoteResp from b8c6c7563d17d844 at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106977Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to 1c40d7bfcdf14e3b at term 2"}
	{"level":"info","ts":"2024-07-31T17:05:00.106982Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b8c6c7563d17d844 [logterm: 2, index: 2014] sent MsgPreVote request to cc1c22e219d8e152 at term 2"}
	
	
	==> kernel <==
	 17:12:47 up 7 min,  0 users,  load average: 0.11, 0.22, 0.11
	Linux ha-393000 5.10.207 #1 SMP Mon Jul 29 15:19:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [193af4895baf] <==
	I0731 17:12:19.070587       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:12:19.070700       1 main.go:299] handling current node
	I0731 17:12:19.072035       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:12:19.072138       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:12:19.072294       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:12:19.072346       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:12:19.072443       1 main.go:295] Handling node with IPs: map[192.169.0.9:{}]
	I0731 17:12:19.072527       1 main.go:322] Node ha-393000-m05 has CIDR [10.244.3.0/24] 
	I0731 17:12:19.072580       1 routes.go:62] Adding route {Ifindex: 0 Dst: 10.244.3.0/24 Src: <nil> Gw: 192.169.0.9 Flags: [] Table: 0} 
	I0731 17:12:29.067499       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:12:29.067687       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:12:29.068024       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:12:29.068167       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:12:29.068319       1 main.go:295] Handling node with IPs: map[192.169.0.9:{}]
	I0731 17:12:29.068428       1 main.go:322] Node ha-393000-m05 has CIDR [10.244.3.0/24] 
	I0731 17:12:29.068600       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:12:29.068650       1 main.go:299] handling current node
	I0731 17:12:39.071971       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:12:39.072157       1 main.go:299] handling current node
	I0731 17:12:39.072214       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:12:39.072232       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:12:39.072412       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:12:39.072497       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:12:39.072667       1 main.go:295] Handling node with IPs: map[192.169.0.9:{}]
	I0731 17:12:39.072783       1 main.go:322] Node ha-393000-m05 has CIDR [10.244.3.0/24] 
	
	
	==> kindnet [8a97800bb665] <==
	I0731 16:59:40.110698       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 16:59:50.118349       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 16:59:50.118427       1 main.go:299] handling current node
	I0731 16:59:50.118450       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 16:59:50.118464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 16:59:50.118651       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 16:59:50.118739       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.118883       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:00.118987       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:00.119126       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:00.119236       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:00.119356       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:00.119483       1 main.go:299] handling current node
	I0731 17:00:10.110002       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:10.111054       1 main.go:299] handling current node
	I0731 17:00:10.111286       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:10.111319       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:10.111445       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:10.111480       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	I0731 17:00:20.116250       1 main.go:295] Handling node with IPs: map[192.169.0.5:{}]
	I0731 17:00:20.116442       1 main.go:299] handling current node
	I0731 17:00:20.116458       1 main.go:295] Handling node with IPs: map[192.169.0.6:{}]
	I0731 17:00:20.116464       1 main.go:322] Node ha-393000-m02 has CIDR [10.244.1.0/24] 
	I0731 17:00:20.116608       1 main.go:295] Handling node with IPs: map[192.169.0.7:{}]
	I0731 17:00:20.116672       1 main.go:322] Node ha-393000-m03 has CIDR [10.244.2.0/24] 
	
	
	==> kube-apiserver [7e0d32286913] <==
	I0731 17:05:50.070570       1 controller.go:80] Starting OpenAPI V3 AggregationController
	I0731 17:05:50.074783       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0731 17:05:50.074947       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:05:50.086677       1 crdregistration_controller.go:111] Starting crd-autoregister controller
	I0731 17:05:50.086708       1 shared_informer.go:313] Waiting for caches to sync for crd-autoregister
	I0731 17:05:50.117864       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0731 17:05:50.122120       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:05:50.122365       1 policy_source.go:224] refreshing policies
	I0731 17:05:50.132563       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0731 17:05:50.166384       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0731 17:05:50.168074       1 shared_informer.go:320] Caches are synced for configmaps
	I0731 17:05:50.168116       1 apf_controller.go:379] Running API Priority and Fairness config worker
	I0731 17:05:50.168122       1 apf_controller.go:382] Running API Priority and Fairness periodic rebalancing process
	I0731 17:05:50.170411       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0731 17:05:50.174248       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0731 17:05:50.178334       1 handler_discovery.go:447] Starting ResourceDiscoveryManager
	I0731 17:05:50.187980       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0731 17:05:50.188024       1 aggregator.go:165] initial CRD sync complete...
	I0731 17:05:50.188030       1 autoregister_controller.go:141] Starting autoregister controller
	I0731 17:05:50.188034       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0731 17:05:50.188038       1 cache.go:39] Caches are synced for autoregister controller
	E0731 17:05:50.205462       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I0731 17:05:51.075340       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0731 17:06:47.219071       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0731 17:07:08.422863       1 controller.go:615] quota admission added evaluator for: endpoints
	
	
	==> kube-apiserver [aec44315311a] <==
	I0731 17:03:27.253147       1 options.go:221] external host was not specified, using 192.169.0.5
	I0731 17:03:27.253888       1 server.go:148] Version: v1.30.3
	I0731 17:03:27.253988       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:03:27.786353       1 shared_informer.go:313] Waiting for caches to sync for node_authorizer
	I0731 17:03:27.788898       1 shared_informer.go:313] Waiting for caches to sync for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0731 17:03:27.790619       1 plugins.go:157] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0731 17:03:27.790629       1 plugins.go:160] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0731 17:03:27.790778       1 instance.go:299] Using reconciler: lease
	W0731 17:03:47.786207       1 logging.go:59] [core] [Channel #1 SubChannel #3] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0731 17:03:47.786314       1 logging.go:59] [core] [Channel #2 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1:2379", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	F0731 17:03:47.791937       1 instance.go:292] Error creating leases: error creating storage factory: context deadline exceeded
	
	
	==> kube-controller-manager [0a6a6d756b8d] <==
	I0731 17:05:30.561595       1 serving.go:380] Generated self-signed cert in-memory
	I0731 17:05:31.250391       1 controllermanager.go:189] "Starting" version="v1.30.3"
	I0731 17:05:31.250471       1 controllermanager.go:191] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:05:31.252077       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0731 17:05:31.252281       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0731 17:05:31.252444       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0731 17:05:31.254793       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	E0731 17:05:51.257636       1 controllermanager.go:234] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: an error on the server (\"[+]ping ok\\n[+]log ok\\n[+]etcd ok\\n[+]poststarthook/start-apiserver-admission-initializer ok\\n[+]poststarthook/generic-apiserver-start-informers ok\\n[+]poststarthook/priority-and-fairness-config-consumer ok\\n[+]poststarthook/priority-and-fairness-filter ok\\n[+]poststarthook/storage-object-count-tracker-hook ok\\n[+]poststarthook/start-apiextensions-informers ok\\n[+]poststarthook/start-apiextensions-controllers ok\\n[+]poststarthook/crd-informer-synced ok\\n[+]poststarthook/start-service-ip-repair-controllers ok\\n[-]poststarthook/rbac/bootstrap-roles failed: reason withheld\\n[+]poststarthook/scheduling/bootstrap-system-priority-classes ok\\n[+]poststarthook/priority-and-fairness-config-producer ok\\n[+]poststarthook/start-system-namespaces-controller
ok\\n[+]poststarthook/bootstrap-controller ok\\n[+]poststarthook/start-cluster-authentication-info-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-controller ok\\n[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok\\n[+]poststarthook/start-legacy-token-tracking-controller ok\\n[+]poststarthook/aggregator-reload-proxy-client-cert ok\\n[+]poststarthook/start-kube-aggregator-informers ok\\n[+]poststarthook/apiservice-registration-controller ok\\n[+]poststarthook/apiservice-status-available-controller ok\\n[+]poststarthook/apiservice-discovery-controller ok\\n[+]poststarthook/kube-apiserver-autoregistration ok\\n[+]autoregister-completion ok\\n[+]poststarthook/apiservice-openapi-controller ok\\n[+]poststarthook/apiservice-openapiv3-controller ok\\nhealthz check failed\") has prevented the request from succeeding"
	
	
	==> kube-controller-manager [42b34888f43b] <==
	I0731 17:06:12.920443       1 shared_informer.go:320] Caches are synced for endpoint_slice
	I0731 17:06:12.952902       1 shared_informer.go:320] Caches are synced for endpoint_slice_mirroring
	I0731 17:06:12.964558       1 shared_informer.go:320] Caches are synced for resource quota
	I0731 17:06:13.012295       1 shared_informer.go:320] Caches are synced for resource quota
	I0731 17:06:13.022225       1 shared_informer.go:320] Caches are synced for ClusterRoleAggregator
	I0731 17:06:13.501091       1 shared_informer.go:320] Caches are synced for garbage collector
	I0731 17:06:13.558892       1 shared_informer.go:320] Caches are synced for garbage collector
	I0731 17:06:13.559095       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0731 17:06:26.973668       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="55.100255ms"
	I0731 17:06:26.975840       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="54.971µs"
	I0731 17:06:29.221856       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="14.898144ms"
	I0731 17:06:29.222046       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="66.05µs"
	I0731 17:06:47.214265       1 endpointslice_controller.go:311] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mvtct\": the object has been modified; please apply your changes to the latest version and try again"
	I0731 17:06:47.214807       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"a0a921f4-5219-42ca-94c6-a4038d9ff710", APIVersion:"v1", ResourceVersion:"259", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mvtct": the object has been modified; please apply your changes to the latest version and try again
	I0731 17:06:47.241205       1 endpointslice_controller.go:311] "Error syncing endpoint slices for service, retrying" logger="endpointslice-controller" key="kube-system/kube-dns" err="failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-mvtct\": the object has been modified; please apply your changes to the latest version and try again"
	I0731 17:06:47.241526       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="78.18352ms"
	I0731 17:06:47.241539       1 event.go:377] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"a0a921f4-5219-42ca-94c6-a4038d9ff710", APIVersion:"v1", ResourceVersion:"259", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-mvtct EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-mvtct": the object has been modified; please apply your changes to the latest version and try again
	E0731 17:06:47.241671       1 replica_set.go:557] sync "kube-system/coredns-7db6d8ff4d" failed with Operation cannot be fulfilled on replicasets.apps "coredns-7db6d8ff4d": the object has been modified; please apply your changes to the latest version and try again
	I0731 17:06:47.242012       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="316.596µs"
	I0731 17:06:47.246958       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="100.8µs"
	I0731 17:06:47.288893       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="32.237881ms"
	I0731 17:06:47.289070       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.102µs"
	I0731 17:12:09.257842       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-393000-m05\" does not exist"
	I0731 17:12:09.279224       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-393000-m05" podCIDRs=["10.244.3.0/24"]
	I0731 17:12:12.913824       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-393000-m05"
	
	
	==> kube-proxy [4f56054bbee1] <==
	I0731 17:06:08.426782       1 server_linux.go:69] "Using iptables proxy"
	I0731 17:06:08.446564       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 17:06:08.497695       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 17:06:08.497829       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 17:06:08.497985       1 server_linux.go:165] "Using iptables Proxier"
	I0731 17:06:08.502095       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 17:06:08.503040       1 server.go:872] "Version info" version="v1.30.3"
	I0731 17:06:08.503116       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 17:06:08.506909       1 config.go:192] "Starting service config controller"
	I0731 17:06:08.507443       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 17:06:08.507578       1 config.go:319] "Starting node config controller"
	I0731 17:06:08.507600       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 17:06:08.509126       1 config.go:101] "Starting endpoint slice config controller"
	I0731 17:06:08.509154       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 17:06:08.607797       1 shared_informer.go:320] Caches are synced for node config
	I0731 17:06:08.607880       1 shared_informer.go:320] Caches are synced for service config
	I0731 17:06:08.610417       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-proxy [d9a1703e5ccc] <==
	I0731 16:54:06.341341       1 server_linux.go:69] "Using iptables proxy"
	I0731 16:54:06.356149       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.169.0.5"]
	I0731 16:54:06.392472       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0731 16:54:06.392493       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0731 16:54:06.392504       1 server_linux.go:165] "Using iptables Proxier"
	I0731 16:54:06.396865       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0731 16:54:06.397004       1 server.go:872] "Version info" version="v1.30.3"
	I0731 16:54:06.397016       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0731 16:54:06.407050       1 config.go:192] "Starting service config controller"
	I0731 16:54:06.407126       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0731 16:54:06.407291       1 config.go:101] "Starting endpoint slice config controller"
	I0731 16:54:06.407378       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0731 16:54:06.407924       1 config.go:319] "Starting node config controller"
	I0731 16:54:06.409039       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0731 16:54:06.507543       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0731 16:54:06.507706       1 shared_informer.go:320] Caches are synced for service config
	I0731 16:54:06.509461       1 shared_informer.go:320] Caches are synced for node config
	
	
	==> kube-scheduler [a34d35a3b612] <==
	I0731 17:12:09.314226       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-vp778" node="ha-393000-m05"
	E0731 17:12:09.315516       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-bc6mj\": pod kindnet-bc6mj is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kindnet-bc6mj" node="ha-393000-m05"
	E0731 17:12:09.317311       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-bc6mj\": pod kindnet-bc6mj is already assigned to node \"ha-393000-m05\"" pod="kube-system/kindnet-bc6mj"
	I0731 17:12:09.317534       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-bc6mj" node="ha-393000-m05"
	E0731 17:12:09.329195       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kindnet-2vcxs\": pod kindnet-2vcxs is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kindnet-2vcxs" node="ha-393000-m05"
	E0731 17:12:09.329378       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kindnet-2vcxs\": pod kindnet-2vcxs is already assigned to node \"ha-393000-m05\"" pod="kube-system/kindnet-2vcxs"
	I0731 17:12:09.329590       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kindnet-2vcxs" node="ha-393000-m05"
	E0731 17:12:09.334254       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-rj84b\": pod kube-proxy-rj84b is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-rj84b" node="ha-393000-m05"
	E0731 17:12:09.334331       1 schedule_one.go:338] "scheduler cache ForgetPod failed" err="pod 16656a49-d07d-4c64-b134-5205c00964dd(kube-system/kube-proxy-rj84b) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-rj84b"
	E0731 17:12:09.334345       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-rj84b\": pod kube-proxy-rj84b is already assigned to node \"ha-393000-m05\"" pod="kube-system/kube-proxy-rj84b"
	I0731 17:12:09.334357       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-rj84b" node="ha-393000-m05"
	E0731 17:12:09.371198       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-kd8cb\": pod kube-proxy-kd8cb is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-kd8cb" node="ha-393000-m05"
	E0731 17:12:09.371261       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-kd8cb\": pod kube-proxy-kd8cb is already assigned to node \"ha-393000-m05\"" pod="kube-system/kube-proxy-kd8cb"
	E0731 17:12:10.476375       1 schedule_one.go:942] "Scheduler cache AssumePod failed" err="pod 5d11dce5-e8d2-44b5-9253-a247d8fdc231(kube-system/kube-proxy-kd8cb) is in the cache, so can't be assumed" pod="kube-system/kube-proxy-kd8cb"
	E0731 17:12:10.476672       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="pod 5d11dce5-e8d2-44b5-9253-a247d8fdc231(kube-system/kube-proxy-kd8cb) is in the cache, so can't be assumed" pod="kube-system/kube-proxy-kd8cb"
	I0731 17:12:10.476834       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-kd8cb" node="ha-393000-m05"
	E0731 17:12:10.795259       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-kdqqk\": pod kube-proxy-kdqqk is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-kdqqk" node="ha-393000-m05"
	E0731 17:12:10.796390       1 schedule_one.go:338] "scheduler cache ForgetPod failed" err="pod cfa57630-31cd-4723-af07-bedc5ba840bb(kube-system/kube-proxy-kdqqk) wasn't assumed so cannot be forgotten" pod="kube-system/kube-proxy-kdqqk"
	E0731 17:12:10.797071       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-kdqqk\": pod kube-proxy-kdqqk is already assigned to node \"ha-393000-m05\"" pod="kube-system/kube-proxy-kdqqk"
	I0731 17:12:10.797784       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-kdqqk" node="ha-393000-m05"
	E0731 17:12:12.552266       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-z758q\": pod kube-proxy-z758q is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-z758q" node="ha-393000-m05"
	E0731 17:12:12.552426       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-z758q\": pod kube-proxy-z758q is already assigned to node \"ha-393000-m05\"" pod="kube-system/kube-proxy-z758q"
	E0731 17:12:12.552850       1 framework.go:1286] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-8vlbk\": pod kube-proxy-8vlbk is already assigned to node \"ha-393000-m05\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-8vlbk" node="ha-393000-m05"
	E0731 17:12:12.553110       1 schedule_one.go:1046] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-8vlbk\": pod kube-proxy-8vlbk is already assigned to node \"ha-393000-m05\"" pod="kube-system/kube-proxy-8vlbk"
	I0731 17:12:12.553266       1 schedule_one.go:1059] "Pod has been assigned to node. Abort adding it back to queue." pod="kube-system/kube-proxy-8vlbk" node="ha-393000-m05"
	
	
	==> kube-scheduler [d088fefe5f8e] <==
	E0731 17:04:26.658553       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:28.887716       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:28.887806       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: Get "https://192.169.0.5:8443/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:32.427417       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:32.427586       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.169.0.5:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:36.436787       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:36.436870       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.169.0.5:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:40.022061       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://192.169.0.5:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:40.022227       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://192.169.0.5:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:43.471012       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:43.471291       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.169.0.5:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:43.930296       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:43.930321       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.169.0.5:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:44.041999       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: Get "https://192.169.0.5:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:44.042358       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.169.0.5:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:48.230649       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:48.230983       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get "https://192.169.0.5:8443/api/v1/namespaces/kube-system/configmaps?fieldSelector=metadata.name%!D(MISSING)extension-apiserver-authentication&limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	W0731 17:04:58.373439       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:04:58.373554       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.169.0.5:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.169.0.5:8443: connect: connection refused
	E0731 17:05:00.249019       1 server.go:214] "waiting for handlers to sync" err="context canceled"
	I0731 17:05:00.249450       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	I0731 17:05:00.249577       1 secure_serving.go:258] Stopped listening on 127.0.0.1:10259
	E0731 17:05:00.249641       1 shared_informer.go:316] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0731 17:05:00.249670       1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	E0731 17:05:00.249984       1 run.go:74] "command failed" err="finished without leader elect"
	
	
	==> kubelet <==
	Jul 31 17:08:22 ha-393000 kubelet[1592]: E0731 17:08:22.903462    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:08:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:08:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:09:22 ha-393000 kubelet[1592]: E0731 17:09:22.903125    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:09:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:09:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:10:22 ha-393000 kubelet[1592]: E0731 17:10:22.903858    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:10:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:10:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:11:22 ha-393000 kubelet[1592]: E0731 17:11:22.902625    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:11:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:11:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:11:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:11:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 31 17:12:22 ha-393000 kubelet[1592]: E0731 17:12:22.904610    1592 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 31 17:12:22 ha-393000 kubelet[1592]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 31 17:12:22 ha-393000 kubelet[1592]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 31 17:12:22 ha-393000 kubelet[1592]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 31 17:12:22 ha-393000 kubelet[1592]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p ha-393000 -n ha-393000
helpers_test.go:261: (dbg) Run:  kubectl --context ha-393000 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:285: <<< TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeAdd (4.88s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)
=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
json_output_test.go:114: step 9 has already been assigned to another step:
Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
Cannot use for:
Deleting "json-output-614000" in hyperkit ...
[Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: be59b9f3-0df0-4ab4-b1bd-44d2f26da44f
datacontenttype: application/json
Data,
{
"currentstep": "0",
"message": "[json-output-614000] minikube v1.33.1 on Darwin 14.5",
"name": "Initial Minikube Setup",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: 9d749e21-c5cd-4abe-9c37-729471201924
datacontenttype: application/json
Data,
{
"message": "MINIKUBE_LOCATION=19349"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: c2ff4586-994a-425e-bb27-2a9a0ef4f877
datacontenttype: application/json
Data,
{
"message": "KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: cfb9d1f0-e612-41f6-b755-49ec9d813fbf
datacontenttype: application/json
Data,
{
"message": "MINIKUBE_BIN=out/minikube-darwin-amd64"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: f5ef942d-652e-4951-8c6c-0947176ca052
datacontenttype: application/json
Data,
{
"message": "MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: ed44b1e7-1acc-41de-917b-d3c0d2ab2fb7
datacontenttype: application/json
Data,
{
"message": "MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: 36075686-c960-4f6d-b529-8561176673a7
datacontenttype: application/json
Data,
{
"message": "MINIKUBE_FORCE_SYSTEMD="
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 7c67104d-d280-4849-aaf2-4a32c6142322
datacontenttype: application/json
Data,
{
"currentstep": "1",
"message": "Using the hyperkit driver based on user configuration",
"name": "Selecting Driver",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: c444d9e6-3e9c-41d2-b78b-df31d1d01457
datacontenttype: application/json
Data,
{
"currentstep": "3",
"message": "Starting \"json-output-614000\" primary control-plane node in \"json-output-614000\" cluster",
"name": "Starting Node",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: c106b9fc-e4b4-41f6-8b5d-06459d2930d0
datacontenttype: application/json
Data,
{
"currentstep": "9",
"message": "Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...",
"name": "Creating VM",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: d7415ad6-c460-4541-a4fc-beceb07a5f01
datacontenttype: application/json
Data,
{
"currentstep": "9",
"message": "Deleting \"json-output-614000\" in hyperkit ...",
"name": "Creating VM",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.error
source: https://minikube.sigs.k8s.io/
id: 64971b27-2a6c-4b84-8a35-68e52d805004
datacontenttype: application/json
Data,
{
"message": "StartHost failed, but will try again: creating host: create host timed out in 360.000000 seconds"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 69fd4d97-93bc-4771-b75a-673532848fa2
datacontenttype: application/json
Data,
{
"currentstep": "9",
"message": "Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...",
"name": "Creating VM",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 95ed5d4c-91c5-4935-ae77-15c6b36eab60
datacontenttype: application/json
Data,
{
"currentstep": "11",
"message": "Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...",
"name": "Preparing Kubernetes",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 2d1eb01b-69c3-4540-a885-a4b9fa7b39e7
datacontenttype: application/json
Data,
{
"currentstep": "12",
"message": "Generating certificates and keys ...",
"name": "Generating certificates",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: fc5342d6-f388-48a4-94ca-7e927609cadc
datacontenttype: application/json
Data,
{
"currentstep": "13",
"message": "Booting up control plane ...",
"name": "Booting control plane",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: ee2184ba-c4d4-4446-a425-f0336845a673
datacontenttype: application/json
Data,
{
"currentstep": "14",
"message": "Configuring RBAC rules ...",
"name": "Configuring RBAC rules",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 35980521-365e-4083-803d-c30df1f60b7c
datacontenttype: application/json
Data,
{
"currentstep": "15",
"message": "Configuring bridge CNI (Container Networking Interface) ...",
"name": "Configuring CNI",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: c98a7ce0-7f9a-46e6-91d6-f8cbb84f3dbf
datacontenttype: application/json
Data,
{
"currentstep": "17",
"message": "Verifying Kubernetes components...",
"name": "Verifying Kubernetes",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: 1915ed3e-ee7a-4540-b8e9-1dd6a49ce891
datacontenttype: application/json
Data,
{
"message": "Using image gcr.io/k8s-minikube/storage-provisioner:v5"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 1d6c01ba-6395-48ee-b85d-44a4a30b574d
datacontenttype: application/json
Data,
{
"currentstep": "18",
"message": "Enabled addons: storage-provisioner, default-storageclass",
"name": "Enabling Addons",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 68c40977-4a11-4444-ae8a-c67557fb7fb4
datacontenttype: application/json
Data,
{
"currentstep": "19",
"message": "Done! kubectl is now configured to use \"json-output-614000\" cluster and \"default\" namespace by default",
"name": "Done",
"totalsteps": "19"
}
]
--- FAIL: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)
=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
json_output_test.go:144: current step is not in increasing order: [Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: be59b9f3-0df0-4ab4-b1bd-44d2f26da44f
datacontenttype: application/json
Data,
{
"currentstep": "0",
"message": "[json-output-614000] minikube v1.33.1 on Darwin 14.5",
"name": "Initial Minikube Setup",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: 9d749e21-c5cd-4abe-9c37-729471201924
datacontenttype: application/json
Data,
{
"message": "MINIKUBE_LOCATION=19349"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: c2ff4586-994a-425e-bb27-2a9a0ef4f877
datacontenttype: application/json
Data,
{
"message": "KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: cfb9d1f0-e612-41f6-b755-49ec9d813fbf
datacontenttype: application/json
Data,
{
"message": "MINIKUBE_BIN=out/minikube-darwin-amd64"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: f5ef942d-652e-4951-8c6c-0947176ca052
datacontenttype: application/json
Data,
{
"message": "MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: ed44b1e7-1acc-41de-917b-d3c0d2ab2fb7
datacontenttype: application/json
Data,
{
"message": "MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: 36075686-c960-4f6d-b529-8561176673a7
datacontenttype: application/json
Data,
{
"message": "MINIKUBE_FORCE_SYSTEMD="
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 7c67104d-d280-4849-aaf2-4a32c6142322
datacontenttype: application/json
Data,
{
"currentstep": "1",
"message": "Using the hyperkit driver based on user configuration",
"name": "Selecting Driver",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: c444d9e6-3e9c-41d2-b78b-df31d1d01457
datacontenttype: application/json
Data,
{
"currentstep": "3",
"message": "Starting \"json-output-614000\" primary control-plane node in \"json-output-614000\" cluster",
"name": "Starting Node",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: c106b9fc-e4b4-41f6-8b5d-06459d2930d0
datacontenttype: application/json
Data,
{
"currentstep": "9",
"message": "Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...",
"name": "Creating VM",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: d7415ad6-c460-4541-a4fc-beceb07a5f01
datacontenttype: application/json
Data,
{
"currentstep": "9",
"message": "Deleting \"json-output-614000\" in hyperkit ...",
"name": "Creating VM",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.error
source: https://minikube.sigs.k8s.io/
id: 64971b27-2a6c-4b84-8a35-68e52d805004
datacontenttype: application/json
Data,
{
"message": "StartHost failed, but will try again: creating host: create host timed out in 360.000000 seconds"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 69fd4d97-93bc-4771-b75a-673532848fa2
datacontenttype: application/json
Data,
{
"currentstep": "9",
"message": "Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...",
"name": "Creating VM",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 95ed5d4c-91c5-4935-ae77-15c6b36eab60
datacontenttype: application/json
Data,
{
"currentstep": "11",
"message": "Preparing Kubernetes v1.30.3 on Docker 27.1.1 ...",
"name": "Preparing Kubernetes",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 2d1eb01b-69c3-4540-a885-a4b9fa7b39e7
datacontenttype: application/json
Data,
{
"currentstep": "12",
"message": "Generating certificates and keys ...",
"name": "Generating certificates",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: fc5342d6-f388-48a4-94ca-7e927609cadc
datacontenttype: application/json
Data,
{
"currentstep": "13",
"message": "Booting up control plane ...",
"name": "Booting control plane",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: ee2184ba-c4d4-4446-a425-f0336845a673
datacontenttype: application/json
Data,
{
"currentstep": "14",
"message": "Configuring RBAC rules ...",
"name": "Configuring RBAC rules",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 35980521-365e-4083-803d-c30df1f60b7c
datacontenttype: application/json
Data,
{
"currentstep": "15",
"message": "Configuring bridge CNI (Container Networking Interface) ...",
"name": "Configuring CNI",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: c98a7ce0-7f9a-46e6-91d6-f8cbb84f3dbf
datacontenttype: application/json
Data,
{
"currentstep": "17",
"message": "Verifying Kubernetes components...",
"name": "Verifying Kubernetes",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.info
source: https://minikube.sigs.k8s.io/
id: 1915ed3e-ee7a-4540-b8e9-1dd6a49ce891
datacontenttype: application/json
Data,
{
"message": "Using image gcr.io/k8s-minikube/storage-provisioner:v5"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 1d6c01ba-6395-48ee-b85d-44a4a30b574d
datacontenttype: application/json
Data,
{
"currentstep": "18",
"message": "Enabled addons: storage-provisioner, default-storageclass",
"name": "Enabling Addons",
"totalsteps": "19"
}
Context Attributes,
specversion: 1.0
type: io.k8s.sigs.minikube.step
source: https://minikube.sigs.k8s.io/
id: 68c40977-4a11-4444-ae8a-c67557fb7fb4
datacontenttype: application/json
Data,
{
"currentstep": "19",
"message": "Done! kubectl is now configured to use \"json-output-614000\" cluster and \"default\" namespace by default",
"name": "Done",
"totalsteps": "19"
}
]
--- FAIL: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)
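The IncreasingCurrentSteps failure above is consistent with the retry visible in the event stream: after the StartHost error, "Creating VM" is emitted with currentstep 9 again, so the sequence of currentstep values repeats. Assuming the check requires strictly increasing steps (an assumption; the actual test code may differ), a minimal sketch of that kind of validation, with the step values transcribed from the stream above:

```go
package main

import "fmt"

// currentstep values transcribed from the JSON event stream above,
// in emission order; step 9 repeats across the VM delete/retry.
var steps = []int{1, 3, 9, 9, 9, 11, 12, 13, 14, 15, 17, 18, 19}

// firstNonIncrease returns the index of the first element that does not
// strictly exceed its predecessor, or -1 if the sequence strictly increases.
func firstNonIncrease(s []int) int {
	for i := 1; i < len(s); i++ {
		if s[i] <= s[i-1] {
			return i
		}
	}
	return -1
}

func main() {
	// A non-negative result marks where the strict-increase property breaks.
	fmt.Println(firstNonIncrease(steps))
}
```

Under this strict-increase assumption, the first repeated step 9 is enough to trip the check even though the run itself eventually completed.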

TestMountStart/serial/StartWithMountFirst (136.68s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-1-499000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit 
E0731 10:23:27.206655    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
mount_start_test.go:98: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p mount-start-1-499000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit : exit status 80 (2m16.600364756s)

-- stdout --
	* [mount-start-1-499000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting minikube without Kubernetes in cluster mount-start-1-499000
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "mount-start-1-499000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for a2:b7:ba:fb:97:a6
	* Failed to start hyperkit VM. Running "minikube delete -p mount-start-1-499000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 86:f1:3e:c5:6b:92
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 86:f1:3e:c5:6b:92
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
mount_start_test.go:100: failed to start minikube with args: "out/minikube-darwin-amd64 start -p mount-start-1-499000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit " : exit status 80
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p mount-start-1-499000 -n mount-start-1-499000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p mount-start-1-499000 -n mount-start-1-499000: exit status 7 (76.979757ms)

-- stdout --
	Error

                                                
** stderr ** 
	E0731 10:25:40.881457    4361 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0731 10:25:40.881479    4361 status.go:249] status error: getting IP: IP address is not set

                                                
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "mount-start-1-499000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestMountStart/serial/StartWithMountFirst (136.68s)
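The repeated "IP address never found in dhcp leases file" errors above mean the hyperkit driver never saw the new VM's MAC address appear in macOS's DHCP lease file. A simplified, hypothetical sketch of that kind of lookup (not the driver's actual code; the sample blob mimics the `/var/db/dhcpd_leases` entry format, where `hw_address` carries a hardware-type prefix before the MAC):

```go
package main

import (
	"fmt"
	"strings"
)

// Sample lease blob in the style of macOS's /var/db/dhcpd_leases
// (illustrative contents only).
const leases = `{
	name=minikube
	ip_address=192.168.64.2
	hw_address=1,a2:b7:ba:fb:97:a6
	lease=0x66aa5b2c
}`

// ipForMAC scans a dhcpd_leases-style blob for the entry whose hw_address
// matches mac and returns its ip_address, or "" when no lease matches.
// It relies on ip_address preceding hw_address within each entry, as in
// the sample above.
func ipForMAC(blob, mac string) string {
	ip := ""
	for _, line := range strings.Split(blob, "\n") {
		line = strings.TrimSpace(line)
		if v, ok := strings.CutPrefix(line, "ip_address="); ok {
			ip = v
		}
		if v, ok := strings.CutPrefix(line, "hw_address="); ok {
			// Strip the "1," hardware-type prefix before comparing MACs.
			if i := strings.IndexByte(v, ','); i >= 0 {
				v = v[i+1:]
			}
			if v == mac {
				return ip
			}
		}
	}
	return ""
}

func main() {
	fmt.Println(ipForMAC(leases, "a2:b7:ba:fb:97:a6"))
}
```

When the guest never obtains a lease (as in the failures above and in the later TestScheduledStopUnix and TestPause runs), a lookup like this keeps returning empty until the driver's timeout expires.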

TestScheduledStopUnix (141.87s)

=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 start -p scheduled-stop-735000 --memory=2048 --driver=hyperkit 
scheduled_stop_test.go:128: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p scheduled-stop-735000 --memory=2048 --driver=hyperkit : exit status 80 (2m16.549545883s)

-- stdout --
	* [scheduled-stop-735000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "scheduled-stop-735000" primary control-plane node in "scheduled-stop-735000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "scheduled-stop-735000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 7e:e1:7e:65:b:f6
	* Failed to start hyperkit VM. Running "minikube delete -p scheduled-stop-735000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 6:2d:d:24:29:b1
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 6:2d:d:24:29:b1
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
scheduled_stop_test.go:130: starting minikube: exit status 80

-- stdout --
	* [scheduled-stop-735000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "scheduled-stop-735000" primary control-plane node in "scheduled-stop-735000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "scheduled-stop-735000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 7e:e1:7e:65:b:f6
	* Failed to start hyperkit VM. Running "minikube delete -p scheduled-stop-735000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 6:2d:d:24:29:b1
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 6:2d:d:24:29:b1
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
panic.go:626: *** TestScheduledStopUnix FAILED at 2024-07-31 10:41:37.257508 -0700 PDT m=+3742.244890551
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-735000 -n scheduled-stop-735000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-735000 -n scheduled-stop-735000: exit status 7 (77.757609ms)

-- stdout --
	Error

                                                
** stderr ** 
	E0731 10:41:37.332928    5482 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0731 10:41:37.332951    5482 status.go:249] status error: getting IP: IP address is not set

** /stderr **
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "scheduled-stop-735000" host is not running, skipping log retrieval (state="Error")
helpers_test.go:175: Cleaning up "scheduled-stop-735000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p scheduled-stop-735000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p scheduled-stop-735000: (5.243914316s)
--- FAIL: TestScheduledStopUnix (141.87s)

TestPause/serial/Start (141.65s)

=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-649000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit 
pause_test.go:80: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p pause-649000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit : exit status 80 (2m21.569654606s)

-- stdout --
	* [pause-649000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting "pause-649000" primary control-plane node in "pause-649000" cluster
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	* Deleting "pause-649000" in hyperkit ...
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	! StartHost failed, but will try again: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 96:fa:c4:e5:74:14
	* Failed to start hyperkit VM. Running "minikube delete -p pause-649000" may fix it: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 4a:2d:2b:bd:16:a1
	X Exiting due to GUEST_PROVISION: error provisioning guest: Failed to start host: creating host: create: Error creating machine: Error in driver during machine creation: IP address never found in dhcp leases file Temporary error: could not find an IP address for 4a:2d:2b:bd:16:a1
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
pause_test.go:82: failed to start minikube with args: "out/minikube-darwin-amd64 start -p pause-649000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit " : exit status 80
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p pause-649000 -n pause-649000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p pause-649000 -n pause-649000: exit status 7 (77.828878ms)

-- stdout --
	Error

                                                
** stderr ** 
	E0731 10:50:26.880321    6088 status.go:352] failed to get driver ip: getting IP: IP address is not set
	E0731 10:50:26.880343    6088 status.go:249] status error: getting IP: IP address is not set

** /stderr **
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "pause-649000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestPause/serial/Start (141.65s)

TestStartStop/group/embed-certs/serial/FirstStart (7201.643s)

=== RUN   TestStartStop/group/embed-certs/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-394000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.30.3
E0731 11:38:44.803147    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/enable-default-cni-390000/client.crt: no such file or directory
E0731 11:39:12.492226    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/enable-default-cni-390000/client.crt: no such file or directory
panic: test timed out after 2h0m0s
running tests:
	TestNetworkPlugins (55m39s)
	TestNetworkPlugins/group (7m14s)
	TestStartStop (51m10s)
	TestStartStop/group/embed-certs (47s)
	TestStartStop/group/embed-certs/serial (47s)
	TestStartStop/group/embed-certs/serial/FirstStart (47s)
	TestStartStop/group/old-k8s-version (8m39s)
	TestStartStop/group/old-k8s-version/serial (8m39s)
	TestStartStop/group/old-k8s-version/serial/SecondStart (5m34s)

goroutine 3886 [running]:
testing.(*M).startAlarm.func1()
	/usr/local/go/src/testing/testing.go:2366 +0x385
created by time.goFunc
	/usr/local/go/src/time/sleep.go:177 +0x2d

goroutine 1 [chan receive, 17 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1650 +0x4ab
testing.tRunner(0xc0006a51e0, 0xc000adbbb0)
	/usr/local/go/src/testing/testing.go:1695 +0x134
testing.runTests(0xc000010348, {0xd9a8ae0, 0x2a, 0x2a}, {0x9479825?, 0xafb30f0?, 0xd9cbaa0?})
	/usr/local/go/src/testing/testing.go:2159 +0x445
testing.(*M).Run(0xc0009f45a0)
	/usr/local/go/src/testing/testing.go:2027 +0x68b
k8s.io/minikube/test/integration.TestMain(0xc0009f45a0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/main_test.go:62 +0x8b
main.main()
	_testmain.go:131 +0x195

goroutine 35 [select]:
go.opencensus.io/stats/view.(*worker).start(0xc000904080)
	/var/lib/jenkins/go/pkg/mod/go.opencensus.io@v0.24.0/stats/view/worker.go:292 +0x9f
created by go.opencensus.io/stats/view.init.0 in goroutine 1
	/var/lib/jenkins/go/pkg/mod/go.opencensus.io@v0.24.0/stats/view/worker.go:34 +0x8d

goroutine 2910 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0xc001bd5e10, 0x12)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc0017d05a0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001bd5e40)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0002bad70, {0xc622c00, 0xc0014b4ff0}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0002bad70, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2939
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 2904 [chan receive, 14 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc0008f8e00, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2922
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 173 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc000b69750, 0xc000c7ff98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x0?, 0xc000b69750, 0xc000b69798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 187
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 172 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0xc000aeed90, 0x2d)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc00091f680)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc000aeee80)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000c36000, {0xc622c00, 0xc000015d10}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000c36000, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 187
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 28 [select]:
k8s.io/klog/v2.(*flushDaemon).run.func1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/klog/v2@v2.130.1/klog.go:1141 +0x117
created by k8s.io/klog/v2.(*flushDaemon).run in goroutine 27
	/var/lib/jenkins/go/pkg/mod/k8s.io/klog/v2@v2.130.1/klog.go:1137 +0x171

goroutine 187 [chan receive, 117 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc000aeee80, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 185
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 938 [chan receive, 107 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001f36980, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 866
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 174 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 173
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 3256 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0xc001bd4750, 0x12)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc00091fb00)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001bd4780)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000555e90, {0xc622c00, 0xc0014b4d80}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000555e90, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3278
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 186 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc00091f7a0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 185
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3612 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0017d15c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3608
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 2671 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc000b6f750, 0xc000b6f798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x30?, 0xc000b6f750, 0xc000b6f798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0xc00133f380?, 0x94ed6a0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000b6f7d0?, 0x95339a4?, 0xc0000cc6c0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2682
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 937 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0015d2f60)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 866
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 2938 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0017d06c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 2937
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3826 [IO wait]:
internal/poll.runtime_pollWait(0x5525c828, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc00215ccc0?, 0xc0016bd142?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc00215ccc0, {0xc0016bd142, 0x1aebe, 0x1aebe})
	/usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0xc001e14cf0, {0xc0016bd142?, 0xc0000fea80?, 0x1ff20?})
	/usr/local/go/src/os/file.go:118 +0x52
bytes.(*Buffer).ReadFrom(0xc001890630, {0xc621618, 0xc00009b4f8})
	/usr/local/go/src/bytes/buffer.go:211 +0x98
io.copyBuffer({0xc621758, 0xc001890630}, {0xc621618, 0xc00009b4f8}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x151
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0xc0017a3678?, {0xc621758, 0xc001890630})
	/usr/local/go/src/os/file.go:269 +0x58
os.(*File).WriteTo(0xc0017a3738?, {0xc621758?, 0xc001890630?})
	/usr/local/go/src/os/file.go:247 +0x49
io.copyBuffer({0xc621758, 0xc001890630}, {0xc6216d8, 0xc001e14cf0}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x9d
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:578 +0x34
os/exec.(*Cmd).Start.func2(0xc001855620?)
	/usr/local/go/src/os/exec/exec.go:728 +0x2c
created by os/exec.(*Cmd).Start in goroutine 3824
	/usr/local/go/src/os/exec/exec.go:727 +0x9ae

goroutine 1155 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0xc001a0ac00, 0xc0013faea0)
	/usr/local/go/src/os/exec/exec.go:793 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 857
	/usr/local/go/src/os/exec/exec.go:754 +0x976

goroutine 2091 [chan receive, 56 minutes]:
testing.(*T).Run(0xc0013cc820, {0xaf5962a?, 0x38151022c49?}, 0xc001eec120)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestNetworkPlugins(0xc0013cc820)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/net_test.go:52 +0xd4
testing.tRunner(0xc0013cc820, 0xc6169f8)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 1109 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0xc00167d200, 0xc00188fb00)
	/usr/local/go/src/os/exec/exec.go:793 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 1108
	/usr/local/go/src/os/exec/exec.go:754 +0x976

goroutine 2700 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc000b88660)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 2688
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 2701 [chan receive, 15 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001f36d80, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2688
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 921 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 920
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 3620 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc00179f750, 0xc00179f798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x0?, 0xc00179f750, 0xc00179f798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc00179f7d0?, 0x99b5ce5?, 0xc0017d15c0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3613
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 3825 [IO wait, 5 minutes]:
internal/poll.runtime_pollWait(0x5525c638, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc00215cc00?, 0xc001793cd5?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc00215cc00, {0xc001793cd5, 0x32b, 0x32b})
	/usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0xc001e14cd8, {0xc001793cd5?, 0x9531b3a?, 0x263?})
	/usr/local/go/src/os/file.go:118 +0x52
bytes.(*Buffer).ReadFrom(0xc001890600, {0xc621618, 0xc00009b4e8})
	/usr/local/go/src/bytes/buffer.go:211 +0x98
io.copyBuffer({0xc621758, 0xc001890600}, {0xc621618, 0xc00009b4e8}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x151
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0xd8dc980?, {0xc621758, 0xc001890600})
	/usr/local/go/src/os/file.go:269 +0x58
os.(*File).WriteTo(0xf?, {0xc621758?, 0xc001890600?})
	/usr/local/go/src/os/file.go:247 +0x49
io.copyBuffer({0xc621758, 0xc001890600}, {0xc6216d8, 0xc001e14cd8}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x9d
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:578 +0x34
os/exec.(*Cmd).Start.func2(0xc001633100?)
	/usr/local/go/src/os/exec/exec.go:728 +0x2c
created by os/exec.(*Cmd).Start in goroutine 3824
	/usr/local/go/src/os/exec/exec.go:727 +0x9ae

goroutine 2708 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0xc001f36d50, 0x12)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc000b88540)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001f36d80)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001ee4720, {0xc622c00, 0xc001c68ba0}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001ee4720, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2701
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 739 [IO wait, 111 minutes]:
internal/poll.runtime_pollWait(0x5525d1d8, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc001632200?, 0x3fe?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0xc001632200)
	/usr/local/go/src/internal/poll/fd_unix.go:611 +0x2ac
net.(*netFD).accept(0xc001632200)
	/usr/local/go/src/net/fd_unix.go:172 +0x29
net.(*TCPListener).accept(0xc000ae2060)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x1e
net.(*TCPListener).Accept(0xc000ae2060)
	/usr/local/go/src/net/tcpsock.go:327 +0x30
net/http.(*Server).Serve(0xc0004300f0, {0xc6397f0, 0xc000ae2060})
	/usr/local/go/src/net/http/server.go:3260 +0x33e
net/http.(*Server).ListenAndServe(0xc0004300f0)
	/usr/local/go/src/net/http/server.go:3189 +0x71
k8s.io/minikube/test/integration.startHTTPProxy.func1(0xd?, 0xc0006a56c0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/functional_test.go:2209 +0x18
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 736
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/functional_test.go:2208 +0x129

goroutine 3782 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3781
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 3171 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc000b72750, 0xc000b72798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x10?, 0xc000b72750, 0xc000b72798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0xc00133fd40?, 0x94ed6a0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000b727d0?, 0x95339a4?, 0xc0013ab9e0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3157
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 1193 [select, 107 minutes]:
net/http.(*persistConn).readLoop(0xc001a09680)
	/usr/local/go/src/net/http/transport.go:2261 +0xd3a
created by net/http.(*Transport).dialConn in goroutine 1206
	/usr/local/go/src/net/http/transport.go:1799 +0x152f

goroutine 2374 [chan receive]:
testing.(*T).Run(0xc0013cdba0, {0xaf5ac7a?, 0x0?}, 0xc001633e80)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc0013cdba0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:130 +0xad9
testing.tRunner(0xc0013cdba0, 0xc001f36300)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2352
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3277 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc00091fe00)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3273
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 2369 [chan receive, 9 minutes]:
testing.(*T).Run(0xc0013ccea0, {0xaf5ac7a?, 0x0?}, 0xc0007fd380)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc0013ccea0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:130 +0xad9
testing.tRunner(0xc0013ccea0, 0xc001f36180)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2352
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 942 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0xc000223c80, 0xc001e1d560)
	/usr/local/go/src/os/exec/exec.go:793 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 941
	/usr/local/go/src/os/exec/exec.go:754 +0x976

goroutine 919 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0xc001f36950, 0x2b)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc0015d2e40)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001f36980)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001e96320, {0xc622c00, 0xc000c33380}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001e96320, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 938
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 2912 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2911
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 2939 [chan receive, 14 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001bd5e40, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2937
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 2911 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc0017a2f50, 0xc0017a2f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0xfc?, 0xc0017a2f50, 0xc0017a2f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0x993f016?, 0xc000223500?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000223500?, 0x993c525?, 0xc001b3a600?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2939
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 3754 [chan receive, 7 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001bd55c0, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3749
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 920 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc000502f50, 0xc000c63f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x80?, 0xc000502f50, 0xc000502f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0xc0006a56c0?, 0x94ed6a0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000502fd0?, 0x95339a4?, 0xc0015c2d80?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 938
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 1183 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0xc001b06000, 0xc001951320)
	/usr/local/go/src/os/exec/exec.go:793 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 1182
	/usr/local/go/src/os/exec/exec.go:754 +0x976

goroutine 1194 [select, 107 minutes]:
net/http.(*persistConn).writeLoop(0xc001a09680)
	/usr/local/go/src/net/http/transport.go:2458 +0xf0
created by net/http.(*Transport).dialConn in goroutine 1206
	/usr/local/go/src/net/http/transport.go:1800 +0x1585

goroutine 3258 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3257
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 3382 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc000c6f750, 0xc000c6f798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x0?, 0xc000c6f750, 0xc000c6f798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3368
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 2670 [sync.Cond.Wait, 5 minutes]:
sync.runtime_notifyListWait(0xc001e703d0, 0x12)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc0017d0960)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001e70400)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0002ba000, {0xc622c00, 0xc001bb2000}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0002ba000, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2682
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 2927 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc000502f50, 0xc000502f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x70?, 0xc000502f50, 0xc000502f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0xc0006a5d40?, 0x94ed6a0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000502fd0?, 0x95339a4?, 0xc001cbab10?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2904
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 2155 [chan receive, 51 minutes]:
testing.(*T).Run(0xc0013cc680, {0xaf5962a?, 0x94ecd73?}, 0xc616ba0)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop(0xc0013cc680)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:46 +0x35
testing.tRunner(0xc0013cc680, 0xc616a40)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3257 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc00179ef50, 0xc00179ef98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x20?, 0xc00179ef50, 0xc00179ef98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0xc001bba4e0?, 0x94ed6a0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x9533945?, 0xc001870300?, 0xc000059920?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3278
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 2143 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc000b73750, 0xc000c54f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0xd0?, 0xc000b73750, 0xc000b73798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0xc000b737b0?, 0x975c858?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000b737d0?, 0x95339a4?, 0xc000222a80?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2151
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 2371 [chan receive, 51 minutes]:
testing.(*testContext).waitParallel(0xc000724780)
	/usr/local/go/src/testing/testing.go:1817 +0xac
testing.(*T).Parallel(0xc0013cd6c0)
	/usr/local/go/src/testing/testing.go:1484 +0x229
k8s.io/minikube/test/integration.MaybeParallel(0xc0013cd6c0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:483 +0x34
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc0013cd6c0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:94 +0x45
testing.tRunner(0xc0013cd6c0, 0xc001f36200)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2352
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 2672 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2671
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 2682 [chan receive, 15 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001e70400, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2680
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 2150 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc000b8e8a0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 2129
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3721 [chan receive, 7 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc000149040, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3710
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 2142 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0xc0007facd0, 0x1e)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc000b8e780)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc0007fad00)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000718070, {0xc622c00, 0xc0014b6090}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000718070, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2151
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 2151 [chan receive, 56 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc0007fad00, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2129
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 2144 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2143
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 2352 [chan receive, 51 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1650 +0x4ab
testing.tRunner(0xc0013cc000, 0xc616ba0)
	/usr/local/go/src/testing/testing.go:1695 +0x134
created by testing.(*T).Run in goroutine 2155
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3506 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc000b8ee40)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3498
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 2180 [chan receive, 7 minutes]:
testing.(*testContext).waitParallel(0xc000724780)
	/usr/local/go/src/testing/testing.go:1817 +0xac
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1665 +0x5e9
testing.tRunner(0xc00133e1a0, 0xc001eec120)
	/usr/local/go/src/testing/testing.go:1695 +0x134
created by testing.(*T).Run in goroutine 2091
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3619 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0xc00087fd90, 0xf)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc0017d14a0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc00087fdc0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0018fe750, {0xc622c00, 0xc000752240}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0018fe750, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3613
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 3592 [chan receive, 7 minutes]:
testing.(*T).Run(0xc00075bba0, {0xaf6679d?, 0x60400000004?}, 0xc001633100)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0xc00075bba0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:155 +0x2af
testing.tRunner(0xc00075bba0, 0xc0007fd380)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2369
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3720 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0016184e0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3710
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3732 [select]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3731
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 3368 [chan receive, 10 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc0007fb000, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3366
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 3731 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc000c6a750, 0xc000c6a798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x50?, 0xc000c6a750, 0xc000c6a798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0xc00075ba00?, 0x94ed6a0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000c6a7d0?, 0x95339a4?, 0xc00150ecf0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3721
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 3780 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0xc001bd5590, 0x1)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc001b84780)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001bd55c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001ca6f50, {0xc622c00, 0xc001af53e0}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001ca6f50, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3754
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 3157 [chan receive, 12 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc00087ee80, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3150
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 2681 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0017d0a80)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 2680
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 2926 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0xc0008f8d90, 0x12)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc001b85860)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc0008f8e00)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001ab08d0, {0xc622c00, 0xc001696870}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001ab08d0, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2904
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 2370 [chan receive, 51 minutes]:
testing.(*testContext).waitParallel(0xc000724780)
	/usr/local/go/src/testing/testing.go:1817 +0xac
testing.(*T).Parallel(0xc0013cd520)
	/usr/local/go/src/testing/testing.go:1484 +0x229
k8s.io/minikube/test/integration.MaybeParallel(0xc0013cd520)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:483 +0x34
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc0013cd520)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:94 +0x45
testing.tRunner(0xc0013cd520, 0xc001f361c0)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2352
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3507 [chan receive, 9 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001bd4340, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3498
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 2928 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2927
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 2373 [chan receive, 51 minutes]:
testing.(*testContext).waitParallel(0xc000724780)
	/usr/local/go/src/testing/testing.go:1817 +0xac
testing.(*T).Parallel(0xc0013cda00)
	/usr/local/go/src/testing/testing.go:1484 +0x229
k8s.io/minikube/test/integration.MaybeParallel(0xc0013cda00)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:483 +0x34
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc0013cda00)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:94 +0x45
testing.tRunner(0xc0013cda00, 0xc001f36280)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2352
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 2903 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc001b85980)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 2922
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3381 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0xc0007faed0, 0x10)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc00215cfc0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc0007fb000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000847950, {0xc622c00, 0xc00052dd70}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000847950, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3368
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 2710 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2709
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 3367 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc00215d0e0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3366
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3502 [sync.Cond.Wait, 3 minutes]:
sync.runtime_notifyListWait(0xc001bd4310, 0x10)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc000b8ed20)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001bd4340)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000b45460, {0xc622c00, 0xc001f4e840}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000b45460, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3507
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 2709 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc000b73750, 0xc000b73798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0xfc?, 0xc000b73750, 0xc000b73798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0x993f016?, 0xc000b8c300?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000b8c300?, 0x993c525?, 0xc000222900?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2701
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 3278 [chan receive, 12 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001bd4780, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3273
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 3753 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc001b848a0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3749
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3156 [select]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc001b114a0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3150
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3383 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3382
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 3172 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3171
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 3170 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0xc00087ee50, 0x12)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc001b11380)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc00087ee80)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001ca6570, {0xc622c00, 0xc001af47b0}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001ca6570, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3157
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 3504 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3503
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 3503 [select, 5 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc000507f50, 0xc000507f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x0?, 0xc000507f50, 0xc000507f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000507fd0?, 0x95339a4?, 0xc000b88660?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3507
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 3880 [syscall]:
syscall.syscall6(0xc001f4ff80?, 0x1000000000010?, 0x10000000019?, 0x54f606c8?, 0x90?, 0xe2ed5b8?, 0x90?)
	/usr/local/go/src/runtime/sys_darwin.go:45 +0x98
syscall.wait4(0xc000c79b78?, 0x93ba0c5?, 0x90?, 0xc582f60?)
	/usr/local/go/src/syscall/zsyscall_darwin_amd64.go:44 +0x45
syscall.Wait4(0x94ea9e5?, 0xc000c79bac, 0x0?, 0x0?)
	/usr/local/go/src/syscall/syscall_bsd.go:144 +0x25
os.(*Process).wait(0xc001b3e960)
	/usr/local/go/src/os/exec_unix.go:43 +0x6d
os.(*Process).Wait(...)
	/usr/local/go/src/os/exec.go:134
os/exec.(*Cmd).Wait(0xc001871500)
	/usr/local/go/src/os/exec/exec.go:901 +0x45
os/exec.(*Cmd).Run(0xc001871500)
	/usr/local/go/src/os/exec/exec.go:608 +0x2d
k8s.io/minikube/test/integration.Run(0xc001bbad00, 0xc001871500)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:103 +0x1e5
k8s.io/minikube/test/integration.validateFirstStart({0xc646740?, 0xc000541880?}, 0xc001bbad00, {0xc0023975f0?, 0x115f8280?}, {0x115f8280015cc758?, 0xc0015cc760?}, {0x94ecd73?, 0x9444dcf?}, {0xc001bfb200, ...})
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:186 +0xd5
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0xc001bbad00)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:156 +0x66
testing.tRunner(0xc001bbad00, 0xc001633f00)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 3879
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3882 [IO wait]:
internal/poll.runtime_pollWait(0x5525d4c0, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc000b8f7a0?, 0xc0016f98e6?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc000b8f7a0, {0xc0016f98e6, 0x1e71a, 0x1e71a})
	/usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0xc001e14410, {0xc0016f98e6?, 0xc0015cc548?, 0x1fe32?})
	/usr/local/go/src/os/file.go:118 +0x52
bytes.(*Buffer).ReadFrom(0xc001f4f5c0, {0xc621618, 0xc00009b278})
	/usr/local/go/src/bytes/buffer.go:211 +0x98
io.copyBuffer({0xc621758, 0xc001f4f5c0}, {0xc621618, 0xc00009b278}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x151
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x10000000d8dc980?, {0xc621758, 0xc001f4f5c0})
	/usr/local/go/src/os/file.go:269 +0x58
os.(*File).WriteTo(0xf?, {0xc621758?, 0xc001f4f5c0?})
	/usr/local/go/src/os/file.go:247 +0x49
io.copyBuffer({0xc621758, 0xc001f4f5c0}, {0xc6216d8, 0xc001e14410}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x9d
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:578 +0x34
os/exec.(*Cmd).Start.func2(0xc001633f00?)
	/usr/local/go/src/os/exec/exec.go:728 +0x2c
created by os/exec.(*Cmd).Start in goroutine 3880
	/usr/local/go/src/os/exec/exec.go:727 +0x9ae

goroutine 3824 [syscall, 7 minutes]:
syscall.syscall6(0xc001891f80?, 0x1000000000010?, 0x10000000019?, 0x54f606c8?, 0x90?, 0xe2ed108?, 0x90?)
	/usr/local/go/src/runtime/sys_darwin.go:45 +0x98
syscall.wait4(0xc000c7eb48?, 0x93ba0c5?, 0x90?, 0xc582f60?)
	/usr/local/go/src/syscall/zsyscall_darwin_amd64.go:44 +0x45
syscall.Wait4(0x94ea9e5?, 0xc000c7eb7c, 0x0?, 0x0?)
	/usr/local/go/src/syscall/syscall_bsd.go:144 +0x25
os.(*Process).wait(0xc0016812c0)
	/usr/local/go/src/os/exec_unix.go:43 +0x6d
os.(*Process).Wait(...)
	/usr/local/go/src/os/exec.go:134
os/exec.(*Cmd).Wait(0xc001b46480)
	/usr/local/go/src/os/exec/exec.go:901 +0x45
os/exec.(*Cmd).Run(0xc001b46480)
	/usr/local/go/src/os/exec/exec.go:608 +0x2d
k8s.io/minikube/test/integration.Run(0xc001bbb380, 0xc001b46480)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:103 +0x1e5
k8s.io/minikube/test/integration.validateSecondStart({0xc646740, 0xc00073dc70}, 0xc001bbb380, {0xc001624dc8, 0x16}, {0x2acdd32000b6bf58?, 0xc000b6bf60?}, {0x94ecd73?, 0x9444dcf?}, {0xc00145ad80, ...})
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:256 +0xe5
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0xc001bbb380)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:156 +0x66
testing.tRunner(0xc001bbb380, 0xc001633100)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 3592
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3781 [select]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xc646900, 0xc000058ba0}, 0xc0017a0f50, 0xc0017a0f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xc646900, 0xc000058ba0}, 0x40?, 0xc0017a0f50, 0xc0017a0f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xc646900?, 0xc000058ba0?}, 0x993f016?, 0xc001871200?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc0017a0fd0?, 0x95339a4?, 0xc00188f440?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3754
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:142 +0x29a

goroutine 3621 [select, 3 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3620
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/poll.go:280 +0xbb

goroutine 3613 [chan receive, 9 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc00087fdc0, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3608
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cache.go:122 +0x585

goroutine 3730 [sync.Cond.Wait]:
sync.runtime_notifyListWait(0xc00071df10, 0x1)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xc10b760?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc0016183c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc000149040)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0007c5870, {0xc622c00, 0xc001ea49c0}, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0007c5870, 0x3b9aca00, 0x0, 0x1, 0xc000058ba0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.3/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3721
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.3/transport/cert_rotation.go:140 +0x1ef

goroutine 3827 [select, 7 minutes]:
os/exec.(*Cmd).watchCtx(0xc001b46480, 0xc0018556e0)
	/usr/local/go/src/os/exec/exec.go:768 +0xb5
created by os/exec.(*Cmd).Start in goroutine 3824
	/usr/local/go/src/os/exec/exec.go:754 +0x976

goroutine 3879 [chan receive]:
testing.(*T).Run(0xc001bba680, {0xaf6465f?, 0x60400000004?}, 0xc001633f00)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0xc001bba680)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:155 +0x2af
testing.tRunner(0xc001bba680, 0xc001633e80)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2374
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3881 [IO wait]:
internal/poll.runtime_pollWait(0x5525cfe8, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc000b8f6e0?, 0xc001365b0b?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc000b8f6e0, {0xc001365b0b, 0x4f5, 0x4f5})
	/usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0xc001e143e0, {0xc001365b0b?, 0xc001b73880?, 0x230?})
	/usr/local/go/src/os/file.go:118 +0x52
bytes.(*Buffer).ReadFrom(0xc001f4f590, {0xc621618, 0xc00009b268})
	/usr/local/go/src/bytes/buffer.go:211 +0x98
io.copyBuffer({0xc621758, 0xc001f4f590}, {0xc621618, 0xc00009b268}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x151
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0xc0015cce78?, {0xc621758, 0xc001f4f590})
	/usr/local/go/src/os/file.go:269 +0x58
os.(*File).WriteTo(0xc0015ccf38?, {0xc621758?, 0xc001f4f590?})
	/usr/local/go/src/os/file.go:247 +0x49
io.copyBuffer({0xc621758, 0xc001f4f590}, {0xc6216d8, 0xc001e143e0}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x9d
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:578 +0x34
os/exec.(*Cmd).Start.func2(0xc00010eea0?)
	/usr/local/go/src/os/exec/exec.go:728 +0x2c
created by os/exec.(*Cmd).Start in goroutine 3880
	/usr/local/go/src/os/exec/exec.go:727 +0x9ae

goroutine 3883 [select]:
os/exec.(*Cmd).watchCtx(0xc001871500, 0xc00056a840)
	/usr/local/go/src/os/exec/exec.go:768 +0xb5
created by os/exec.(*Cmd).Start in goroutine 3880
	/usr/local/go/src/os/exec/exec.go:754 +0x976

Test pass (182/227)

Order | Passed test | Duration (s)
3 TestDownloadOnly/v1.20.0/json-events 17.2
4 TestDownloadOnly/v1.20.0/preload-exists 0
7 TestDownloadOnly/v1.20.0/kubectl 0
8 TestDownloadOnly/v1.20.0/LogsDuration 0.29
9 TestDownloadOnly/v1.20.0/DeleteAll 0.24
10 TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds 0.21
12 TestDownloadOnly/v1.30.3/json-events 7.49
13 TestDownloadOnly/v1.30.3/preload-exists 0
16 TestDownloadOnly/v1.30.3/kubectl 0
17 TestDownloadOnly/v1.30.3/LogsDuration 0.29
18 TestDownloadOnly/v1.30.3/DeleteAll 0.23
19 TestDownloadOnly/v1.30.3/DeleteAlwaysSucceeds 0.21
21 TestDownloadOnly/v1.31.0-beta.0/json-events 8.78
22 TestDownloadOnly/v1.31.0-beta.0/preload-exists 0
25 TestDownloadOnly/v1.31.0-beta.0/kubectl 0
26 TestDownloadOnly/v1.31.0-beta.0/LogsDuration 0.3
27 TestDownloadOnly/v1.31.0-beta.0/DeleteAll 0.23
28 TestDownloadOnly/v1.31.0-beta.0/DeleteAlwaysSucceeds 0.21
30 TestBinaryMirror 0.97
34 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.21
35 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.19
36 TestAddons/Setup 214.74
38 TestAddons/serial/Volcano 40.28
40 TestAddons/serial/GCPAuth/Namespaces 0.1
42 TestAddons/parallel/Registry 15.02
43 TestAddons/parallel/Ingress 19.76
44 TestAddons/parallel/InspektorGadget 10.54
45 TestAddons/parallel/MetricsServer 6.49
46 TestAddons/parallel/HelmTiller 10.14
48 TestAddons/parallel/CSI 44.19
49 TestAddons/parallel/Headlamp 17.46
50 TestAddons/parallel/CloudSpanner 5.37
51 TestAddons/parallel/LocalPath 55.44
52 TestAddons/parallel/NvidiaDevicePlugin 5.32
53 TestAddons/parallel/Yakd 10.45
54 TestAddons/StoppedEnableDisable 5.91
62 TestHyperKitDriverInstallOrUpdate 15.4
65 TestErrorSpam/setup 37.13
66 TestErrorSpam/start 1.53
67 TestErrorSpam/status 0.48
68 TestErrorSpam/pause 1.35
69 TestErrorSpam/unpause 1.34
70 TestErrorSpam/stop 153.83
73 TestFunctional/serial/CopySyncFile 0
74 TestFunctional/serial/StartWithProxy 52.65
75 TestFunctional/serial/AuditLog 0
76 TestFunctional/serial/SoftStart 36.13
77 TestFunctional/serial/KubeContext 0.04
78 TestFunctional/serial/KubectlGetPods 0.07
81 TestFunctional/serial/CacheCmd/cache/add_remote 3.01
82 TestFunctional/serial/CacheCmd/cache/add_local 1.31
83 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.08
84 TestFunctional/serial/CacheCmd/cache/list 0.08
85 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.18
86 TestFunctional/serial/CacheCmd/cache/cache_reload 1.01
87 TestFunctional/serial/CacheCmd/cache/delete 0.17
88 TestFunctional/serial/MinikubeKubectlCmd 1.17
89 TestFunctional/serial/MinikubeKubectlCmdDirectly 1.45
90 TestFunctional/serial/ExtraConfig 39.55
91 TestFunctional/serial/ComponentHealth 0.05
92 TestFunctional/serial/LogsCmd 2.79
93 TestFunctional/serial/LogsFileCmd 2.81
94 TestFunctional/serial/InvalidService 3.67
96 TestFunctional/parallel/ConfigCmd 0.49
97 TestFunctional/parallel/DashboardCmd 12.12
98 TestFunctional/parallel/DryRun 1.2
99 TestFunctional/parallel/InternationalLanguage 0.55
100 TestFunctional/parallel/StatusCmd 0.51
104 TestFunctional/parallel/ServiceCmdConnect 12.36
105 TestFunctional/parallel/AddonsCmd 0.22
106 TestFunctional/parallel/PersistentVolumeClaim 27.5
108 TestFunctional/parallel/SSHCmd 0.28
109 TestFunctional/parallel/CpCmd 0.9
110 TestFunctional/parallel/MySQL 24.23
111 TestFunctional/parallel/FileSync 0.15
112 TestFunctional/parallel/CertSync 0.91
116 TestFunctional/parallel/NodeLabels 0.05
118 TestFunctional/parallel/NonActiveRuntimeDisabled 0.17
120 TestFunctional/parallel/License 0.49
122 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.36
123 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.02
125 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 10.14
126 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.05
127 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.02
128 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.04
129 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.03
130 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.02
131 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.13
132 TestFunctional/parallel/ServiceCmd/DeployApp 6.11
133 TestFunctional/parallel/ProfileCmd/profile_not_create 0.24
134 TestFunctional/parallel/ProfileCmd/profile_list 0.25
135 TestFunctional/parallel/ProfileCmd/profile_json_output 0.25
136 TestFunctional/parallel/MountCmd/any-port 5.9
137 TestFunctional/parallel/ServiceCmd/List 0.38
138 TestFunctional/parallel/ServiceCmd/JSONOutput 0.37
139 TestFunctional/parallel/ServiceCmd/HTTPS 0.24
140 TestFunctional/parallel/ServiceCmd/Format 0.24
141 TestFunctional/parallel/ServiceCmd/URL 0.25
143 TestFunctional/parallel/Version/short 0.1
144 TestFunctional/parallel/Version/components 0.37
145 TestFunctional/parallel/ImageCommands/ImageListShort 0.15
146 TestFunctional/parallel/ImageCommands/ImageListTable 0.16
147 TestFunctional/parallel/ImageCommands/ImageListJson 0.16
148 TestFunctional/parallel/ImageCommands/ImageListYaml 0.17
149 TestFunctional/parallel/ImageCommands/ImageBuild 2.68
150 TestFunctional/parallel/ImageCommands/Setup 1.83
151 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 0.92
152 TestFunctional/parallel/MountCmd/VerifyCleanup 1.7
153 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 1.41
154 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 1.42
155 TestFunctional/parallel/DockerEnv/bash 0.6
156 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.29
157 TestFunctional/parallel/ImageCommands/ImageRemove 0.31
158 TestFunctional/parallel/UpdateContextCmd/no_changes 0.19
159 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.21
160 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.19
161 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 0.52
162 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 0.35
163 TestFunctional/delete_echo-server_images 0.04
164 TestFunctional/delete_my-image_image 0.02
165 TestFunctional/delete_minikube_cached_images 0.02
169 TestMultiControlPlane/serial/StartCluster 210.89
170 TestMultiControlPlane/serial/DeployApp 6.24
171 TestMultiControlPlane/serial/PingHostFromPods 1.24
173 TestMultiControlPlane/serial/NodeLabels 0.05
174 TestMultiControlPlane/serial/HAppyAfterClusterStart 0.34
177 TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop 0.27
179 TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart 0.34
190 TestImageBuild/serial/Setup 37.14
191 TestImageBuild/serial/NormalBuild 1.67
192 TestImageBuild/serial/BuildWithBuildArg 0.79
193 TestImageBuild/serial/BuildWithDockerIgnore 0.7
194 TestImageBuild/serial/BuildWithSpecifiedDockerfile 0.59
198 TestJSONOutput/start/Command 459.48
199 TestJSONOutput/start/Audit 0
204 TestJSONOutput/pause/Command 0.48
205 TestJSONOutput/pause/Audit 0
207 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
208 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
210 TestJSONOutput/unpause/Command 0.45
211 TestJSONOutput/unpause/Audit 0
213 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
214 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
216 TestJSONOutput/stop/Command 8.34
217 TestJSONOutput/stop/Audit 0
219 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
220 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
221 TestErrorJSONOutput 0.58
226 TestMainNoArgs 0.08
227 TestMinikubeProfile 93.1
233 TestMultiNode/serial/FreshStart2Nodes 112.69
234 TestMultiNode/serial/DeployApp2Nodes 5.63
235 TestMultiNode/serial/PingHostFrom2Pods 0.87
236 TestMultiNode/serial/AddNode 45.47
237 TestMultiNode/serial/MultiNodeLabels 0.05
238 TestMultiNode/serial/ProfileList 0.18
239 TestMultiNode/serial/CopyFile 5.33
240 TestMultiNode/serial/StopNode 2.86
241 TestMultiNode/serial/StartAfterStop 41.81
242 TestMultiNode/serial/RestartKeepsNodes 140.28
243 TestMultiNode/serial/DeleteNode 3.44
244 TestMultiNode/serial/StopMultiNode 16.8
245 TestMultiNode/serial/RestartMultiNode 107.06
246 TestMultiNode/serial/ValidateNameConflict 43.98
250 TestPreload 277.28
253 TestSkaffold 113.18
256 TestRunningBinaryUpgrade 74.2
258 TestKubernetesUpgrade 204.08
271 TestStoppedBinaryUpgrade/Setup 1.85
272 TestStoppedBinaryUpgrade/Upgrade 110.04
273 TestStoppedBinaryUpgrade/MinikubeLogs 2.7
284 TestNoKubernetes/serial/StartNoK8sWithVersion 0.46
285 TestNoKubernetes/serial/StartWithK8s 86.19
286 TestNoKubernetes/serial/StartWithStopK8s 56.95
287 TestNoKubernetes/serial/Start 73.39
288 TestNoKubernetes/serial/VerifyK8sNotRunning 0.12
289 TestNoKubernetes/serial/ProfileList 0.37
290 TestNoKubernetes/serial/Stop 2.35
291 TestNoKubernetes/serial/StartNoArgs 75.49
292 TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current 4.08
293 TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current 7.65
294 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.17
TestDownloadOnly/v1.20.0/json-events (17.2s)

=== RUN   TestDownloadOnly/v1.20.0/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-822000 --force --alsologtostderr --kubernetes-version=v1.20.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-822000 --force --alsologtostderr --kubernetes-version=v1.20.0 --container-runtime=docker --driver=hyperkit : (17.196937408s)
--- PASS: TestDownloadOnly/v1.20.0/json-events (17.20s)

TestDownloadOnly/v1.20.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.20.0/preload-exists
--- PASS: TestDownloadOnly/v1.20.0/preload-exists (0.00s)

TestDownloadOnly/v1.20.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.20.0/kubectl
--- PASS: TestDownloadOnly/v1.20.0/kubectl (0.00s)

TestDownloadOnly/v1.20.0/LogsDuration (0.29s)

=== RUN   TestDownloadOnly/v1.20.0/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-822000
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-822000: exit status 85 (290.333808ms)

-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-822000 | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT |          |
	|         | -p download-only-822000        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.20.0   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 09:39:14
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 09:39:14.978307    1593 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:39:14.978492    1593 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:39:14.978499    1593 out.go:304] Setting ErrFile to fd 2...
	I0731 09:39:14.978503    1593 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:39:14.978674    1593 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	W0731 09:39:14.978774    1593 root.go:314] Error reading config file at /Users/jenkins/minikube-integration/19349-1046/.minikube/config/config.json: open /Users/jenkins/minikube-integration/19349-1046/.minikube/config/config.json: no such file or directory
	I0731 09:39:14.980678    1593 out.go:298] Setting JSON to true
	I0731 09:39:15.004006    1593 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":524,"bootTime":1722443430,"procs":448,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 09:39:15.004087    1593 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 09:39:15.025684    1593 out.go:97] [download-only-822000] minikube v1.33.1 on Darwin 14.5
	I0731 09:39:15.025869    1593 notify.go:220] Checking for updates...
	W0731 09:39:15.025903    1593 preload.go:293] Failed to list preload files: open /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball: no such file or directory
	I0731 09:39:15.047434    1593 out.go:169] MINIKUBE_LOCATION=19349
	I0731 09:39:15.068772    1593 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:39:15.090728    1593 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 09:39:15.112955    1593 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 09:39:15.134751    1593 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	W0731 09:39:15.176705    1593 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0731 09:39:15.177230    1593 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 09:39:15.227850    1593 out.go:97] Using the hyperkit driver based on user configuration
	I0731 09:39:15.227918    1593 start.go:297] selected driver: hyperkit
	I0731 09:39:15.227932    1593 start.go:901] validating driver "hyperkit" against <nil>
	I0731 09:39:15.228155    1593 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:39:15.228619    1593 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 09:39:15.637610    1593 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 09:39:15.642964    1593 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:39:15.642987    1593 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 09:39:15.643016    1593 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 09:39:15.647354    1593 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0731 09:39:15.647894    1593 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0731 09:39:15.647949    1593 cni.go:84] Creating CNI manager for ""
	I0731 09:39:15.647967    1593 cni.go:162] CNI unnecessary in this configuration, recommending no CNI
	I0731 09:39:15.648048    1593 start.go:340] cluster config:
	{Name:download-only-822000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:6000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.20.0 ClusterName:download-only-822000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local
ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.20.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:39:15.648279    1593 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:39:15.669681    1593 out.go:97] Downloading VM boot image ...
	I0731 09:39:15.669784    1593 download.go:107] Downloading: https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso.sha256 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/iso/amd64/minikube-v1.33.1-1722248113-19339-amd64.iso
	I0731 09:39:23.693098    1593 out.go:97] Starting "download-only-822000" primary control-plane node in "download-only-822000" cluster
	I0731 09:39:23.693136    1593 preload.go:131] Checking if preload exists for k8s version v1.20.0 and runtime docker
	I0731 09:39:23.744279    1593 preload.go:118] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.20.0/preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4
	I0731 09:39:23.744349    1593 cache.go:56] Caching tarball of preloaded images
	I0731 09:39:23.744757    1593 preload.go:131] Checking if preload exists for k8s version v1.20.0 and runtime docker
	I0731 09:39:23.766418    1593 out.go:97] Downloading Kubernetes v1.20.0 preload ...
	I0731 09:39:23.766439    1593 preload.go:236] getting checksum for preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4 ...
	I0731 09:39:23.841452    1593 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.20.0/preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4?checksum=md5:9a82241e9b8b4ad2b5cca73108f2c7a3 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4
	
	
	* The control-plane node download-only-822000 host does not exist
	  To start a cluster, run: "minikube start -p download-only-822000"

-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.20.0/LogsDuration (0.29s)

TestDownloadOnly/v1.20.0/DeleteAll (0.24s)

=== RUN   TestDownloadOnly/v1.20.0/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/v1.20.0/DeleteAll (0.24s)

TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds (0.21s)

=== RUN   TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-822000
--- PASS: TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds (0.21s)

TestDownloadOnly/v1.30.3/json-events (7.49s)

=== RUN   TestDownloadOnly/v1.30.3/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-751000 --force --alsologtostderr --kubernetes-version=v1.30.3 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-751000 --force --alsologtostderr --kubernetes-version=v1.30.3 --container-runtime=docker --driver=hyperkit : (7.485652406s)
--- PASS: TestDownloadOnly/v1.30.3/json-events (7.49s)

TestDownloadOnly/v1.30.3/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.30.3/preload-exists
--- PASS: TestDownloadOnly/v1.30.3/preload-exists (0.00s)

TestDownloadOnly/v1.30.3/kubectl (0s)

=== RUN   TestDownloadOnly/v1.30.3/kubectl
--- PASS: TestDownloadOnly/v1.30.3/kubectl (0.00s)

TestDownloadOnly/v1.30.3/LogsDuration (0.29s)

=== RUN   TestDownloadOnly/v1.30.3/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-751000
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-751000: exit status 85 (294.163332ms)

-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only        | download-only-822000 | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT |                     |
	|         | -p download-only-822000        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.20.0   |                      |         |         |                     |                     |
	|         | --container-runtime=docker     |                      |         |         |                     |                     |
	|         | --driver=hyperkit              |                      |         |         |                     |                     |
	| delete  | --all                          | minikube             | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT | 31 Jul 24 09:39 PDT |
	| delete  | -p download-only-822000        | download-only-822000 | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT | 31 Jul 24 09:39 PDT |
	| start   | -o=json --download-only        | download-only-751000 | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT |                     |
	|         | -p download-only-751000        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.30.3   |                      |         |         |                     |                     |
	|         | --container-runtime=docker     |                      |         |         |                     |                     |
	|         | --driver=hyperkit              |                      |         |         |                     |                     |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 09:39:32
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 09:39:32.913200    1617 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:39:32.913378    1617 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:39:32.913384    1617 out.go:304] Setting ErrFile to fd 2...
	I0731 09:39:32.913388    1617 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:39:32.913564    1617 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:39:32.915028    1617 out.go:298] Setting JSON to true
	I0731 09:39:32.939507    1617 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":542,"bootTime":1722443430,"procs":448,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 09:39:32.939596    1617 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 09:39:32.960989    1617 out.go:97] [download-only-751000] minikube v1.33.1 on Darwin 14.5
	I0731 09:39:32.961161    1617 notify.go:220] Checking for updates...
	I0731 09:39:32.981972    1617 out.go:169] MINIKUBE_LOCATION=19349
	I0731 09:39:33.003141    1617 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:39:33.023934    1617 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 09:39:33.045305    1617 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 09:39:33.067271    1617 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	W0731 09:39:33.109042    1617 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0731 09:39:33.109444    1617 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 09:39:33.139073    1617 out.go:97] Using the hyperkit driver based on user configuration
	I0731 09:39:33.139116    1617 start.go:297] selected driver: hyperkit
	I0731 09:39:33.139126    1617 start.go:901] validating driver "hyperkit" against <nil>
	I0731 09:39:33.139300    1617 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:39:33.139472    1617 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 09:39:33.148912    1617 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 09:39:33.153104    1617 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:39:33.153126    1617 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 09:39:33.153150    1617 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 09:39:33.155879    1617 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0731 09:39:33.156036    1617 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0731 09:39:33.156090    1617 cni.go:84] Creating CNI manager for ""
	I0731 09:39:33.156111    1617 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0731 09:39:33.156126    1617 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0731 09:39:33.156197    1617 start.go:340] cluster config:
	{Name:download-only-751000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:6000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.3 ClusterName:download-only-751000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local
ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:39:33.156289    1617 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:39:33.176943    1617 out.go:97] Starting "download-only-751000" primary control-plane node in "download-only-751000" cluster
	I0731 09:39:33.176971    1617 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:39:33.235593    1617 preload.go:118] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.30.3/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	I0731 09:39:33.235615    1617 cache.go:56] Caching tarball of preloaded images
	I0731 09:39:33.235901    1617 preload.go:131] Checking if preload exists for k8s version v1.30.3 and runtime docker
	I0731 09:39:33.256837    1617 out.go:97] Downloading Kubernetes v1.30.3 preload ...
	I0731 09:39:33.256858    1617 preload.go:236] getting checksum for preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4 ...
	I0731 09:39:33.333586    1617 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.30.3/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4?checksum=md5:6304692df2fe6f7b0bdd7f93d160be8c -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.3-docker-overlay2-amd64.tar.lz4
	
	
	* The control-plane node download-only-751000 host does not exist
	  To start a cluster, run: "minikube start -p download-only-751000"

-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.30.3/LogsDuration (0.29s)

TestDownloadOnly/v1.30.3/DeleteAll (0.23s)

=== RUN   TestDownloadOnly/v1.30.3/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/v1.30.3/DeleteAll (0.23s)

TestDownloadOnly/v1.30.3/DeleteAlwaysSucceeds (0.21s)

=== RUN   TestDownloadOnly/v1.30.3/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-751000
--- PASS: TestDownloadOnly/v1.30.3/DeleteAlwaysSucceeds (0.21s)

TestDownloadOnly/v1.31.0-beta.0/json-events (8.78s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-900000 --force --alsologtostderr --kubernetes-version=v1.31.0-beta.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-900000 --force --alsologtostderr --kubernetes-version=v1.31.0-beta.0 --container-runtime=docker --driver=hyperkit : (8.784384372s)
--- PASS: TestDownloadOnly/v1.31.0-beta.0/json-events (8.78s)

TestDownloadOnly/v1.31.0-beta.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/preload-exists
--- PASS: TestDownloadOnly/v1.31.0-beta.0/preload-exists (0.00s)

TestDownloadOnly/v1.31.0-beta.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/kubectl
--- PASS: TestDownloadOnly/v1.31.0-beta.0/kubectl (0.00s)

TestDownloadOnly/v1.31.0-beta.0/LogsDuration (0.3s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-900000
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-900000: exit status 85 (301.607386ms)

-- stdout --
	
	==> Audit <==
	|---------|-------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |                Args                 |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|-------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only             | download-only-822000 | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT |                     |
	|         | -p download-only-822000             |                      |         |         |                     |                     |
	|         | --force --alsologtostderr           |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.20.0        |                      |         |         |                     |                     |
	|         | --container-runtime=docker          |                      |         |         |                     |                     |
	|         | --driver=hyperkit                   |                      |         |         |                     |                     |
	| delete  | --all                               | minikube             | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT | 31 Jul 24 09:39 PDT |
	| delete  | -p download-only-822000             | download-only-822000 | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT | 31 Jul 24 09:39 PDT |
	| start   | -o=json --download-only             | download-only-751000 | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT |                     |
	|         | -p download-only-751000             |                      |         |         |                     |                     |
	|         | --force --alsologtostderr           |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.30.3        |                      |         |         |                     |                     |
	|         | --container-runtime=docker          |                      |         |         |                     |                     |
	|         | --driver=hyperkit                   |                      |         |         |                     |                     |
	| delete  | --all                               | minikube             | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT | 31 Jul 24 09:39 PDT |
	| delete  | -p download-only-751000             | download-only-751000 | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT | 31 Jul 24 09:39 PDT |
	| start   | -o=json --download-only             | download-only-900000 | jenkins | v1.33.1 | 31 Jul 24 09:39 PDT |                     |
	|         | -p download-only-900000             |                      |         |         |                     |                     |
	|         | --force --alsologtostderr           |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.31.0-beta.0 |                      |         |         |                     |                     |
	|         | --container-runtime=docker          |                      |         |         |                     |                     |
	|         | --driver=hyperkit                   |                      |         |         |                     |                     |
	|---------|-------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/31 09:39:41
	Running on machine: MacOS-Agent-4
	Binary: Built with gc go1.22.5 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0731 09:39:41.137580    1641 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:39:41.137842    1641 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:39:41.137848    1641 out.go:304] Setting ErrFile to fd 2...
	I0731 09:39:41.137851    1641 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:39:41.138023    1641 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:39:41.139459    1641 out.go:298] Setting JSON to true
	I0731 09:39:41.165107    1641 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":551,"bootTime":1722443430,"procs":448,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 09:39:41.165193    1641 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 09:39:41.186760    1641 out.go:97] [download-only-900000] minikube v1.33.1 on Darwin 14.5
	I0731 09:39:41.186941    1641 notify.go:220] Checking for updates...
	I0731 09:39:41.207724    1641 out.go:169] MINIKUBE_LOCATION=19349
	I0731 09:39:41.228581    1641 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:39:41.249811    1641 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 09:39:41.270752    1641 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 09:39:41.291906    1641 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	W0731 09:39:41.333705    1641 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0731 09:39:41.334112    1641 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 09:39:41.363922    1641 out.go:97] Using the hyperkit driver based on user configuration
	I0731 09:39:41.363980    1641 start.go:297] selected driver: hyperkit
	I0731 09:39:41.363995    1641 start.go:901] validating driver "hyperkit" against <nil>
	I0731 09:39:41.364244    1641 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:39:41.364506    1641 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/19349-1046/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0731 09:39:41.374290    1641 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0731 09:39:41.378336    1641 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:39:41.378357    1641 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0731 09:39:41.378383    1641 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0731 09:39:41.381187    1641 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0731 09:39:41.381336    1641 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0731 09:39:41.381389    1641 cni.go:84] Creating CNI manager for ""
	I0731 09:39:41.381414    1641 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0731 09:39:41.381428    1641 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0731 09:39:41.381490    1641 start.go:340] cluster config:
	{Name:download-only-900000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:6000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0-beta.0 ClusterName:download-only-900000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster
.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0-beta.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:39:41.381574    1641 iso.go:125] acquiring lock: {Name:mk76e1f83f7adfb3eef322e581368b6ffecd9831 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0731 09:39:41.402791    1641 out.go:97] Starting "download-only-900000" primary control-plane node in "download-only-900000" cluster
	I0731 09:39:41.402832    1641 preload.go:131] Checking if preload exists for k8s version v1.31.0-beta.0 and runtime docker
	I0731 09:39:41.457856    1641 preload.go:118] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.31.0-beta.0/preloaded-images-k8s-v18-v1.31.0-beta.0-docker-overlay2-amd64.tar.lz4
	I0731 09:39:41.457887    1641 cache.go:56] Caching tarball of preloaded images
	I0731 09:39:41.458203    1641 preload.go:131] Checking if preload exists for k8s version v1.31.0-beta.0 and runtime docker
	I0731 09:39:41.479607    1641 out.go:97] Downloading Kubernetes v1.31.0-beta.0 preload ...
	I0731 09:39:41.479626    1641 preload.go:236] getting checksum for preloaded-images-k8s-v18-v1.31.0-beta.0-docker-overlay2-amd64.tar.lz4 ...
	I0731 09:39:41.557424    1641 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.31.0-beta.0/preloaded-images-k8s-v18-v1.31.0-beta.0-docker-overlay2-amd64.tar.lz4?checksum=md5:181d3c061f7abe363e688bf9ac3c9580 -> /Users/jenkins/minikube-integration/19349-1046/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-beta.0-docker-overlay2-amd64.tar.lz4
	
	
	* The control-plane node download-only-900000 host does not exist
	  To start a cluster, run: "minikube start -p download-only-900000"

-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.31.0-beta.0/LogsDuration (0.30s)

TestDownloadOnly/v1.31.0-beta.0/DeleteAll (0.23s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/v1.31.0-beta.0/DeleteAll (0.23s)

TestDownloadOnly/v1.31.0-beta.0/DeleteAlwaysSucceeds (0.21s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-900000
--- PASS: TestDownloadOnly/v1.31.0-beta.0/DeleteAlwaysSucceeds (0.21s)

TestBinaryMirror (0.97s)

=== RUN   TestBinaryMirror
aaa_download_only_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 start --download-only -p binary-mirror-235000 --alsologtostderr --binary-mirror http://127.0.0.1:49641 --driver=hyperkit 
helpers_test.go:175: Cleaning up "binary-mirror-235000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p binary-mirror-235000
--- PASS: TestBinaryMirror (0.97s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.21s)

=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:1037: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-937000
addons_test.go:1037: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons enable dashboard -p addons-937000: exit status 85 (206.430014ms)

-- stdout --
	* Profile "addons-937000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-937000"

-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.21s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.19s)

=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:1048: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-937000
addons_test.go:1048: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons disable dashboard -p addons-937000: exit status 85 (186.876154ms)

-- stdout --
	* Profile "addons-937000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-937000"

-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.19s)

TestAddons/Setup (214.74s)

=== RUN   TestAddons/Setup
addons_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 start -p addons-937000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --addons=volcano --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:110: (dbg) Done: out/minikube-darwin-amd64 start -p addons-937000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --addons=volcano --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller: (3m34.741592612s)
--- PASS: TestAddons/Setup (214.74s)

TestAddons/serial/Volcano (40.28s)

=== RUN   TestAddons/serial/Volcano
addons_test.go:905: volcano-admission stabilized in 11.281505ms
addons_test.go:897: volcano-scheduler stabilized in 11.37107ms
addons_test.go:913: volcano-controller stabilized in 11.381356ms
addons_test.go:919: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-scheduler" in namespace "volcano-system" ...
helpers_test.go:344: "volcano-scheduler-844f6db89b-mjq7t" [b9317894-b315-4c82-bc8b-8fa3818bd1d6] Running
addons_test.go:919: (dbg) TestAddons/serial/Volcano: app=volcano-scheduler healthy within 5.00338477s
addons_test.go:923: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-admission" in namespace "volcano-system" ...
helpers_test.go:344: "volcano-admission-5f7844f7bc-vvn6b" [92aa94b6-632c-4ca2-8bfc-47ee57d71976] Running
addons_test.go:923: (dbg) TestAddons/serial/Volcano: app=volcano-admission healthy within 5.002167685s
addons_test.go:927: (dbg) TestAddons/serial/Volcano: waiting 6m0s for pods matching "app=volcano-controller" in namespace "volcano-system" ...
helpers_test.go:344: "volcano-controllers-59cb4746db-cgz9t" [1b8b84da-83cb-47e2-ace9-a0a321ded132] Running
addons_test.go:927: (dbg) TestAddons/serial/Volcano: app=volcano-controller healthy within 5.002339429s
addons_test.go:932: (dbg) Run:  kubectl --context addons-937000 delete -n volcano-system job volcano-admission-init
addons_test.go:938: (dbg) Run:  kubectl --context addons-937000 create -f testdata/vcjob.yaml
addons_test.go:946: (dbg) Run:  kubectl --context addons-937000 get vcjob -n my-volcano
addons_test.go:964: (dbg) TestAddons/serial/Volcano: waiting 3m0s for pods matching "volcano.sh/job-name=test-job" in namespace "my-volcano" ...
helpers_test.go:344: "test-job-nginx-0" [0dc850bd-26ed-446e-8ee8-5ba361cdb091] Pending
helpers_test.go:344: "test-job-nginx-0" [0dc850bd-26ed-446e-8ee8-5ba361cdb091] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "test-job-nginx-0" [0dc850bd-26ed-446e-8ee8-5ba361cdb091] Running
addons_test.go:964: (dbg) TestAddons/serial/Volcano: volcano.sh/job-name=test-job healthy within 15.004791888s
addons_test.go:968: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable volcano --alsologtostderr -v=1
addons_test.go:968: (dbg) Done: out/minikube-darwin-amd64 -p addons-937000 addons disable volcano --alsologtostderr -v=1: (9.989360712s)
--- PASS: TestAddons/serial/Volcano (40.28s)

TestAddons/serial/GCPAuth/Namespaces (0.1s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:656: (dbg) Run:  kubectl --context addons-937000 create ns new-namespace
addons_test.go:670: (dbg) Run:  kubectl --context addons-937000 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.10s)

TestAddons/parallel/Registry (15.02s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:332: registry stabilized in 1.982529ms
addons_test.go:334: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-698f998955-9bhgf" [710bd2a2-d7ad-446e-81ab-db873fee78e2] Running
addons_test.go:334: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 6.003655818s
addons_test.go:337: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-proxy-5zckr" [8d9729df-8e6e-46ed-b2e2-b18ef032de82] Running
addons_test.go:337: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.003088514s
addons_test.go:342: (dbg) Run:  kubectl --context addons-937000 delete po -l run=registry-test --now
addons_test.go:347: (dbg) Run:  kubectl --context addons-937000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:347: (dbg) Done: kubectl --context addons-937000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (3.377187855s)
addons_test.go:361: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 ip
2024/07/31 09:44:38 [DEBUG] GET http://192.169.0.2:5000
addons_test.go:390: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (15.02s)

TestAddons/parallel/Ingress (19.76s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:209: (dbg) Run:  kubectl --context addons-937000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:234: (dbg) Run:  kubectl --context addons-937000 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:247: (dbg) Run:  kubectl --context addons-937000 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:344: "nginx" [8e8367aa-b325-4560-8651-adab1be8627d] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx" [8e8367aa-b325-4560-8651-adab1be8627d] Running
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 10.004361588s
addons_test.go:264: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:288: (dbg) Run:  kubectl --context addons-937000 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:293: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 ip
addons_test.go:299: (dbg) Run:  nslookup hello-john.test 192.169.0.2
addons_test.go:308: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:308: (dbg) Done: out/minikube-darwin-amd64 -p addons-937000 addons disable ingress-dns --alsologtostderr -v=1: (1.406541873s)
addons_test.go:313: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable ingress --alsologtostderr -v=1
addons_test.go:313: (dbg) Done: out/minikube-darwin-amd64 -p addons-937000 addons disable ingress --alsologtostderr -v=1: (7.437830491s)
--- PASS: TestAddons/parallel/Ingress (19.76s)

TestAddons/parallel/InspektorGadget (10.54s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:848: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:344: "gadget-w7j48" [0695c28f-5814-466f-aa9f-3a40e05cb066] Running / Ready:ContainersNotReady (containers with unready status: [gadget]) / ContainersReady:ContainersNotReady (containers with unready status: [gadget])
addons_test.go:848: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 5.003372203s
addons_test.go:851: (dbg) Run:  out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-937000
addons_test.go:851: (dbg) Done: out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-937000: (5.532874118s)
--- PASS: TestAddons/parallel/InspektorGadget (10.54s)

TestAddons/parallel/MetricsServer (6.49s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:409: metrics-server stabilized in 1.545207ms
addons_test.go:411: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:344: "metrics-server-c59844bb4-g2th2" [ca21c49f-9068-4a59-9e76-4e8e4b1b00ed] Running
addons_test.go:411: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 6.004078627s
addons_test.go:417: (dbg) Run:  kubectl --context addons-937000 top pods -n kube-system
addons_test.go:434: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (6.49s)

TestAddons/parallel/HelmTiller (10.14s)

=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:458: tiller-deploy stabilized in 1.5779ms
addons_test.go:460: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:344: "tiller-deploy-6677d64bcd-g4jl5" [6874fda2-5fd6-48f4-86f1-b288f216d53e] Running
addons_test.go:460: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 5.003218018s
addons_test.go:475: (dbg) Run:  kubectl --context addons-937000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version
addons_test.go:475: (dbg) Done: kubectl --context addons-937000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version: (4.706682006s)
addons_test.go:492: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (10.14s)

TestAddons/parallel/CSI (44.19s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
addons_test.go:567: csi-hostpath-driver pods stabilized in 4.387982ms
addons_test.go:570: (dbg) Run:  kubectl --context addons-937000 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:575: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:580: (dbg) Run:  kubectl --context addons-937000 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:585: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:344: "task-pv-pod" [389d25c7-1357-4bd1-9211-89cc68ce0cd2] Pending
helpers_test.go:344: "task-pv-pod" [389d25c7-1357-4bd1-9211-89cc68ce0cd2] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod" [389d25c7-1357-4bd1-9211-89cc68ce0cd2] Running
addons_test.go:585: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 8.002794851s
addons_test.go:590: (dbg) Run:  kubectl --context addons-937000 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:595: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:419: (dbg) Run:  kubectl --context addons-937000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:419: (dbg) Run:  kubectl --context addons-937000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:600: (dbg) Run:  kubectl --context addons-937000 delete pod task-pv-pod
addons_test.go:606: (dbg) Run:  kubectl --context addons-937000 delete pvc hpvc
addons_test.go:612: (dbg) Run:  kubectl --context addons-937000 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:617: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:622: (dbg) Run:  kubectl --context addons-937000 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:627: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:344: "task-pv-pod-restore" [57ae3875-3cea-4263-a6dc-0d04de4248a7] Pending
helpers_test.go:344: "task-pv-pod-restore" [57ae3875-3cea-4263-a6dc-0d04de4248a7] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod-restore" [57ae3875-3cea-4263-a6dc-0d04de4248a7] Running
addons_test.go:627: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 7.003798507s
addons_test.go:632: (dbg) Run:  kubectl --context addons-937000 delete pod task-pv-pod-restore
addons_test.go:636: (dbg) Run:  kubectl --context addons-937000 delete pvc hpvc-restore
addons_test.go:640: (dbg) Run:  kubectl --context addons-937000 delete volumesnapshot new-snapshot-demo
addons_test.go:644: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:644: (dbg) Done: out/minikube-darwin-amd64 -p addons-937000 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.440872139s)
addons_test.go:648: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (44.19s)

TestAddons/parallel/Headlamp (17.46s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:830: (dbg) Run:  out/minikube-darwin-amd64 addons enable headlamp -p addons-937000 --alsologtostderr -v=1
addons_test.go:835: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:344: "headlamp-9d868696f-42rts" [869203fe-c7ae-49bf-9fa8-8b02f39b94f8] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:344: "headlamp-9d868696f-42rts" [869203fe-c7ae-49bf-9fa8-8b02f39b94f8] Running
addons_test.go:835: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 11.002919417s
addons_test.go:839: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable headlamp --alsologtostderr -v=1
addons_test.go:839: (dbg) Done: out/minikube-darwin-amd64 -p addons-937000 addons disable headlamp --alsologtostderr -v=1: (5.462301709s)
--- PASS: TestAddons/parallel/Headlamp (17.46s)

TestAddons/parallel/CloudSpanner (5.37s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:867: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:344: "cloud-spanner-emulator-5455fb9b69-fttkn" [7f16faa1-fa9f-4cfb-ac45-09ee515bf336] Running
addons_test.go:867: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.002285563s
addons_test.go:870: (dbg) Run:  out/minikube-darwin-amd64 addons disable cloud-spanner -p addons-937000
--- PASS: TestAddons/parallel/CloudSpanner (5.37s)

TestAddons/parallel/LocalPath (55.44s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath

=== CONT  TestAddons/parallel/LocalPath
addons_test.go:982: (dbg) Run:  kubectl --context addons-937000 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:988: (dbg) Run:  kubectl --context addons-937000 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:992: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-937000 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:995: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:344: "test-local-path" [94719439-92aa-464c-b3ec-398d0d799906] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "test-local-path" [94719439-92aa-464c-b3ec-398d0d799906] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "test-local-path" [94719439-92aa-464c-b3ec-398d0d799906] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:995: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 6.004884384s
addons_test.go:1000: (dbg) Run:  kubectl --context addons-937000 get pvc test-pvc -o=json
addons_test.go:1009: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 ssh "cat /opt/local-path-provisioner/pvc-ad2e94d6-5ec2-4576-a019-81ba4cfa8223_default_test-pvc/file1"
addons_test.go:1021: (dbg) Run:  kubectl --context addons-937000 delete pod test-local-path
addons_test.go:1025: (dbg) Run:  kubectl --context addons-937000 delete pvc test-pvc
addons_test.go:1029: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:1029: (dbg) Done: out/minikube-darwin-amd64 -p addons-937000 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (42.806406292s)
--- PASS: TestAddons/parallel/LocalPath (55.44s)

TestAddons/parallel/NvidiaDevicePlugin (5.32s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin

=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:1061: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:344: "nvidia-device-plugin-daemonset-nw7t9" [f68dd15d-6437-49ca-9576-a598ee54c761] Running
addons_test.go:1061: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.003051504s
addons_test.go:1064: (dbg) Run:  out/minikube-darwin-amd64 addons disable nvidia-device-plugin -p addons-937000
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (5.32s)

TestAddons/parallel/Yakd (10.45s)

=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd

=== CONT  TestAddons/parallel/Yakd
addons_test.go:1072: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:344: "yakd-dashboard-799879c74f-cwmbp" [f887f035-8e53-4da6-9c2a-3db143906048] Running
addons_test.go:1072: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 5.002282403s
addons_test.go:1076: (dbg) Run:  out/minikube-darwin-amd64 -p addons-937000 addons disable yakd --alsologtostderr -v=1
addons_test.go:1076: (dbg) Done: out/minikube-darwin-amd64 -p addons-937000 addons disable yakd --alsologtostderr -v=1: (5.443635905s)
--- PASS: TestAddons/parallel/Yakd (10.45s)

TestAddons/StoppedEnableDisable (5.91s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 stop -p addons-937000
addons_test.go:174: (dbg) Done: out/minikube-darwin-amd64 stop -p addons-937000: (5.370056769s)
addons_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-937000
addons_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-937000
addons_test.go:187: (dbg) Run:  out/minikube-darwin-amd64 addons disable gvisor -p addons-937000
--- PASS: TestAddons/StoppedEnableDisable (5.91s)

TestHyperKitDriverInstallOrUpdate (15.4s)

=== RUN   TestHyperKitDriverInstallOrUpdate
=== PAUSE TestHyperKitDriverInstallOrUpdate

=== CONT  TestHyperKitDriverInstallOrUpdate
--- PASS: TestHyperKitDriverInstallOrUpdate (15.40s)

TestErrorSpam/setup (37.13s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -p nospam-399000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 --driver=hyperkit 
error_spam_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -p nospam-399000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 --driver=hyperkit : (37.132834976s)
--- PASS: TestErrorSpam/setup (37.13s)

TestErrorSpam/start (1.53s)

=== RUN   TestErrorSpam/start
error_spam_test.go:216: Cleaning up 1 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 start --dry-run
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 start --dry-run
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 start --dry-run
--- PASS: TestErrorSpam/start (1.53s)

TestErrorSpam/status (0.48s)

=== RUN   TestErrorSpam/status
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 status
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 status
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 status
--- PASS: TestErrorSpam/status (0.48s)

TestErrorSpam/pause (1.35s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 pause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 pause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 pause
--- PASS: TestErrorSpam/pause (1.35s)

TestErrorSpam/unpause (1.34s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 unpause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 unpause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 unpause
--- PASS: TestErrorSpam/unpause (1.34s)

TestErrorSpam/stop (153.83s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 stop: (3.3900159s)
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 stop: (1m15.22735735s)
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 stop
E0731 09:48:27.153709    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:27.162686    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:27.174919    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:27.195147    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:27.235997    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:27.317159    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:27.479350    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:27.800927    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:28.441211    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:29.722523    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:32.284769    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:37.405749    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:48:47.645982    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:49:08.128101    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
error_spam_test.go:182: (dbg) Done: out/minikube-darwin-amd64 -p nospam-399000 --log_dir /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/nospam-399000 stop: (1m15.212205125s)
--- PASS: TestErrorSpam/stop (153.83s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1851: local sync path: /Users/jenkins/minikube-integration/19349-1046/.minikube/files/etc/test/nested/copy/1591/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (52.65s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2230: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-680000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit 
E0731 09:49:49.089585    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
functional_test.go:2230: (dbg) Done: out/minikube-darwin-amd64 start -p functional-680000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit : (52.649246873s)
--- PASS: TestFunctional/serial/StartWithProxy (52.65s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (36.13s)

=== RUN   TestFunctional/serial/SoftStart
functional_test.go:655: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-680000 --alsologtostderr -v=8
functional_test.go:655: (dbg) Done: out/minikube-darwin-amd64 start -p functional-680000 --alsologtostderr -v=8: (36.124796483s)
functional_test.go:659: soft start took 36.125278716s for "functional-680000" cluster.
--- PASS: TestFunctional/serial/SoftStart (36.13s)

TestFunctional/serial/KubeContext (0.04s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:677: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.04s)

TestFunctional/serial/KubectlGetPods (0.07s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:692: (dbg) Run:  kubectl --context functional-680000 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.07s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.01s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 cache add registry.k8s.io/pause:3.1
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-680000 cache add registry.k8s.io/pause:3.1: (1.128341441s)
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 cache add registry.k8s.io/pause:3.3
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 cache add registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.01s)

TestFunctional/serial/CacheCmd/cache/add_local (1.31s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1073: (dbg) Run:  docker build -t minikube-local-cache-test:functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialCacheCmdcacheadd_local1310878287/001
functional_test.go:1085: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 cache add minikube-local-cache-test:functional-680000
functional_test.go:1090: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 cache delete minikube-local-cache-test:functional-680000
functional_test.go:1079: (dbg) Run:  docker rmi minikube-local-cache-test:functional-680000
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.31s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.08s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1098: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.08s)

TestFunctional/serial/CacheCmd/cache/list (0.08s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1106: (dbg) Run:  out/minikube-darwin-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.08s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.18s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1120: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.18s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.01s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1143: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh sudo docker rmi registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (143.062817ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1154: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 cache reload
functional_test.go:1159: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.01s)

TestFunctional/serial/CacheCmd/cache/delete (0.17s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1168: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1168: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.17s)

TestFunctional/serial/MinikubeKubectlCmd (1.17s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:712: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 kubectl -- --context functional-680000 get pods
functional_test.go:712: (dbg) Done: out/minikube-darwin-amd64 -p functional-680000 kubectl -- --context functional-680000 get pods: (1.167722855s)
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (1.17s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (1.45s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:737: (dbg) Run:  out/kubectl --context functional-680000 get pods
functional_test.go:737: (dbg) Done: out/kubectl --context functional-680000 get pods: (1.451837515s)
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (1.45s)

TestFunctional/serial/ExtraConfig (39.55s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:753: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-680000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E0731 09:51:11.009665    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
functional_test.go:753: (dbg) Done: out/minikube-darwin-amd64 start -p functional-680000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (39.552922221s)
functional_test.go:757: restart took 39.553064085s for "functional-680000" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (39.55s)

TestFunctional/serial/ComponentHealth (0.05s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:806: (dbg) Run:  kubectl --context functional-680000 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:821: etcd phase: Running
functional_test.go:831: etcd status: Ready
functional_test.go:821: kube-apiserver phase: Running
functional_test.go:831: kube-apiserver status: Ready
functional_test.go:821: kube-controller-manager phase: Running
functional_test.go:831: kube-controller-manager status: Ready
functional_test.go:821: kube-scheduler phase: Running
functional_test.go:831: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.05s)

TestFunctional/serial/LogsCmd (2.79s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1232: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 logs
functional_test.go:1232: (dbg) Done: out/minikube-darwin-amd64 -p functional-680000 logs: (2.788397656s)
--- PASS: TestFunctional/serial/LogsCmd (2.79s)

TestFunctional/serial/LogsFileCmd (2.81s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1246: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 logs --file /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialLogsFileCmd3666318043/001/logs.txt
functional_test.go:1246: (dbg) Done: out/minikube-darwin-amd64 -p functional-680000 logs --file /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalserialLogsFileCmd3666318043/001/logs.txt: (2.805197056s)
--- PASS: TestFunctional/serial/LogsFileCmd (2.81s)

TestFunctional/serial/InvalidService (3.67s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2317: (dbg) Run:  kubectl --context functional-680000 apply -f testdata/invalidsvc.yaml
functional_test.go:2331: (dbg) Run:  out/minikube-darwin-amd64 service invalid-svc -p functional-680000
functional_test.go:2331: (dbg) Non-zero exit: out/minikube-darwin-amd64 service invalid-svc -p functional-680000: exit status 115 (270.542134ms)

-- stdout --
	|-----------|-------------|-------------|--------------------------|
	| NAMESPACE |    NAME     | TARGET PORT |           URL            |
	|-----------|-------------|-------------|--------------------------|
	| default   | invalid-svc |          80 | http://192.169.0.4:32036 |
	|-----------|-------------|-------------|--------------------------|
	
	

-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                            │
	│    * If the above advice does not help, please let us know:                                                                │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                              │
	│                                                                                                                            │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                   │
	│    * Please also attach the following file to the GitHub issue:                                                            │
	│    * - /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log    │
	│                                                                                                                            │
	╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
functional_test.go:2323: (dbg) Run:  kubectl --context functional-680000 delete -f testdata/invalidsvc.yaml
--- PASS: TestFunctional/serial/InvalidService (3.67s)

TestFunctional/parallel/ConfigCmd (0.49s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 config get cpus: exit status 14 (65.948874ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 config set cpus 2
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 config get cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 config get cpus: exit status 14 (54.643541ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.49s)

TestFunctional/parallel/DashboardCmd (12.12s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:901: (dbg) daemon: [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-680000 --alsologtostderr -v=1]
functional_test.go:906: (dbg) stopping [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-680000 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to kill pid 2638: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (12.12s)

TestFunctional/parallel/DryRun (1.2s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:970: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-680000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:970: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-680000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (491.243763ms)

-- stdout --
	* [functional-680000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile

-- /stdout --
** stderr ** 
	I0731 09:52:26.434664    2585 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:52:26.434904    2585 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:52:26.434910    2585 out.go:304] Setting ErrFile to fd 2...
	I0731 09:52:26.434914    2585 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:52:26.435079    2585 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:52:26.436577    2585 out.go:298] Setting JSON to false
	I0731 09:52:26.459305    2585 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1316,"bootTime":1722443430,"procs":496,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 09:52:26.459397    2585 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 09:52:26.480966    2585 out.go:177] * [functional-680000] minikube v1.33.1 on Darwin 14.5
	I0731 09:52:26.522626    2585 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 09:52:26.522662    2585 notify.go:220] Checking for updates...
	I0731 09:52:26.564470    2585 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:52:26.585741    2585 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 09:52:26.606666    2585 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 09:52:26.627555    2585 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:52:26.648727    2585 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 09:52:26.670398    2585 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:52:26.671036    2585 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:52:26.671117    2585 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:52:26.681687    2585 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50625
	I0731 09:52:26.682056    2585 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:52:26.682472    2585 main.go:141] libmachine: Using API Version  1
	I0731 09:52:26.682486    2585 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:52:26.682715    2585 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:52:26.682838    2585 main.go:141] libmachine: (functional-680000) Calling .DriverName
	I0731 09:52:26.683035    2585 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 09:52:26.683296    2585 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:52:26.683330    2585 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:52:26.691851    2585 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50627
	I0731 09:52:26.692191    2585 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:52:26.692579    2585 main.go:141] libmachine: Using API Version  1
	I0731 09:52:26.692597    2585 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:52:26.692806    2585 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:52:26.692919    2585 main.go:141] libmachine: (functional-680000) Calling .DriverName
	I0731 09:52:26.721701    2585 out.go:177] * Using the hyperkit driver based on existing profile
	I0731 09:52:26.763559    2585 start.go:297] selected driver: hyperkit
	I0731 09:52:26.763589    2585 start.go:901] validating driver "hyperkit" against &{Name:functional-680000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:4000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfi
g:{KubernetesVersion:v1.30.3 ClusterName:functional-680000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.4 Port:8441 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:2628
0h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:52:26.763796    2585 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 09:52:26.788583    2585 out.go:177] 
	W0731 09:52:26.809665    2585 out.go:239] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I0731 09:52:26.830690    2585 out.go:177] 

** /stderr **
functional_test.go:987: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-680000 --dry-run --alsologtostderr -v=1 --driver=hyperkit 
--- PASS: TestFunctional/parallel/DryRun (1.20s)

TestFunctional/parallel/InternationalLanguage (0.55s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1016: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-680000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:1016: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-680000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (547.672178ms)

-- stdout --
	* [functional-680000] minikube v1.33.1 sur Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote hyperkit basé sur le profil existant

-- /stdout --
** stderr ** 
	I0731 09:52:25.878870    2578 out.go:291] Setting OutFile to fd 1 ...
	I0731 09:52:25.879137    2578 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:52:25.879143    2578 out.go:304] Setting ErrFile to fd 2...
	I0731 09:52:25.879147    2578 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 09:52:25.879342    2578 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 09:52:25.880932    2578 out.go:298] Setting JSON to false
	I0731 09:52:25.903580    2578 start.go:129] hostinfo: {"hostname":"MacOS-Agent-4.local","uptime":1315,"bootTime":1722443430,"procs":494,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"f2f27e25-cfda-5ffd-9706-e98286194e62"}
	W0731 09:52:25.903667    2578 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0731 09:52:25.928567    2578 out.go:177] * [functional-680000] minikube v1.33.1 sur Darwin 14.5
	I0731 09:52:25.984279    2578 notify.go:220] Checking for updates...
	I0731 09:52:26.022140    2578 out.go:177]   - MINIKUBE_LOCATION=19349
	I0731 09:52:26.042949    2578 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	I0731 09:52:26.064098    2578 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0731 09:52:26.084993    2578 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0731 09:52:26.106003    2578 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	I0731 09:52:26.127082    2578 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0731 09:52:26.148400    2578 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 09:52:26.148865    2578 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:52:26.148910    2578 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:52:26.157754    2578 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50620
	I0731 09:52:26.158092    2578 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:52:26.158496    2578 main.go:141] libmachine: Using API Version  1
	I0731 09:52:26.158512    2578 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:52:26.158722    2578 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:52:26.158831    2578 main.go:141] libmachine: (functional-680000) Calling .DriverName
	I0731 09:52:26.159019    2578 driver.go:392] Setting default libvirt URI to qemu:///system
	I0731 09:52:26.159276    2578 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 09:52:26.159312    2578 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 09:52:26.167551    2578 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50622
	I0731 09:52:26.167892    2578 main.go:141] libmachine: () Calling .GetVersion
	I0731 09:52:26.168251    2578 main.go:141] libmachine: Using API Version  1
	I0731 09:52:26.168267    2578 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 09:52:26.168448    2578 main.go:141] libmachine: () Calling .GetMachineName
	I0731 09:52:26.168558    2578 main.go:141] libmachine: (functional-680000) Calling .DriverName
	I0731 09:52:26.197037    2578 out.go:177] * Utilisation du pilote hyperkit basé sur le profil existant
	I0731 09:52:26.255245    2578 start.go:297] selected driver: hyperkit
	I0731 09:52:26.255274    2578 start.go:901] validating driver "hyperkit" against &{Name:functional-680000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19339/minikube-v1.33.1-1722248113-19339-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721902582-19326@sha256:540fb5dc7f38be17ff5276a38dfe6c8a4b1d9ba1c27c62244e6eebd7e37696e7 Memory:4000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfi
g:{KubernetesVersion:v1.30.3 ClusterName:functional-680000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.4 Port:8441 KubernetesVersion:v1.30.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:2628
0h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0731 09:52:26.255479    2578 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0731 09:52:26.280987    2578 out.go:177] 
	W0731 09:52:26.302301    2578 out.go:239] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I0731 09:52:26.322935    2578 out.go:177] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.55s)

TestFunctional/parallel/StatusCmd (0.51s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:850: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 status
functional_test.go:856: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:868: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.51s)

TestFunctional/parallel/ServiceCmdConnect (12.36s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1625: (dbg) Run:  kubectl --context functional-680000 create deployment hello-node-connect --image=registry.k8s.io/echoserver:1.8
functional_test.go:1631: (dbg) Run:  kubectl --context functional-680000 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1636: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:344: "hello-node-connect-57b4589c47-d9wzq" [62ed55ce-15b6-4c1c-94d0-8e099eabcf62] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-connect-57b4589c47-d9wzq" [62ed55ce-15b6-4c1c-94d0-8e099eabcf62] Running
functional_test.go:1636: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 12.003665557s
functional_test.go:1645: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 service hello-node-connect --url
functional_test.go:1651: found endpoint for hello-node-connect: http://192.169.0.4:31882
functional_test.go:1671: http://192.169.0.4:31882: success! body:

Hostname: hello-node-connect-57b4589c47-d9wzq

Pod Information:
	-no pod information available-

Server values:
	server_version=nginx: 1.13.3 - lua: 10008

Request Information:
	client_address=10.244.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.169.0.4:8080/

Request Headers:
	accept-encoding=gzip
	host=192.169.0.4:31882
	user-agent=Go-http-client/1.1

Request Body:
	-no body in request-

--- PASS: TestFunctional/parallel/ServiceCmdConnect (12.36s)

TestFunctional/parallel/AddonsCmd (0.22s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1686: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 addons list
functional_test.go:1698: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.22s)

TestFunctional/parallel/PersistentVolumeClaim (27.5s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:344: "storage-provisioner" [076e633b-0039-4a03-9b5f-cb69139df2ae] Running
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 6.005081743s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-680000 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-680000 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-680000 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-680000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [9cc70d39-e9b6-460f-8a48-446824384b39] Pending
helpers_test.go:344: "sp-pod" [9cc70d39-e9b6-460f-8a48-446824384b39] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [9cc70d39-e9b6-460f-8a48-446824384b39] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 13.004918176s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-680000 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-680000 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-680000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [77912cf8-c11c-43a8-8fc9-3004f37af0ff] Pending
helpers_test.go:344: "sp-pod" [77912cf8-c11c-43a8-8fc9-3004f37af0ff] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [77912cf8-c11c-43a8-8fc9-3004f37af0ff] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 7.005022441s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-680000 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (27.50s)

TestFunctional/parallel/SSHCmd (0.28s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd

=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1721: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "echo hello"
functional_test.go:1738: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.28s)

TestFunctional/parallel/CpCmd (0.9s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh -n functional-680000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 cp functional-680000:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelCpCmd1890558517/001/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh -n functional-680000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh -n functional-680000 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (0.90s)

TestFunctional/parallel/MySQL (24.23s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL

=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1789: (dbg) Run:  kubectl --context functional-680000 replace --force -f testdata/mysql.yaml
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:344: "mysql-64454c8b5c-fltd6" [b7b65e07-ca00-4864-bb1b-fbc164267f26] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])
helpers_test.go:344: "mysql-64454c8b5c-fltd6" [b7b65e07-ca00-4864-bb1b-fbc164267f26] Running
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 21.004693271s
functional_test.go:1803: (dbg) Run:  kubectl --context functional-680000 exec mysql-64454c8b5c-fltd6 -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-680000 exec mysql-64454c8b5c-fltd6 -- mysql -ppassword -e "show databases;": exit status 1 (103.716092ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-680000 exec mysql-64454c8b5c-fltd6 -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-680000 exec mysql-64454c8b5c-fltd6 -- mysql -ppassword -e "show databases;": exit status 1 (120.426893ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-680000 exec mysql-64454c8b5c-fltd6 -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (24.23s)

TestFunctional/parallel/FileSync (0.15s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync

=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1925: Checking for existence of /etc/test/nested/copy/1591/hosts within VM
functional_test.go:1927: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "sudo cat /etc/test/nested/copy/1591/hosts"
functional_test.go:1932: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.15s)

TestFunctional/parallel/CertSync (0.91s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync

=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1968: Checking for existence of /etc/ssl/certs/1591.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "sudo cat /etc/ssl/certs/1591.pem"
functional_test.go:1968: Checking for existence of /usr/share/ca-certificates/1591.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "sudo cat /usr/share/ca-certificates/1591.pem"
functional_test.go:1968: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/15912.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "sudo cat /etc/ssl/certs/15912.pem"
functional_test.go:1995: Checking for existence of /usr/share/ca-certificates/15912.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "sudo cat /usr/share/ca-certificates/15912.pem"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (0.91s)

TestFunctional/parallel/NodeLabels (0.05s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels

=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:218: (dbg) Run:  kubectl --context functional-680000 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.05s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.17s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2023: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "sudo systemctl is-active crio"
functional_test.go:2023: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "sudo systemctl is-active crio": exit status 1 (171.049588ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.17s)

TestFunctional/parallel/License (0.49s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License

=== CONT  TestFunctional/parallel/License
functional_test.go:2284: (dbg) Run:  out/minikube-darwin-amd64 license
--- PASS: TestFunctional/parallel/License (0.49s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.36s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-680000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-680000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-680000 tunnel --alsologtostderr] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-680000 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to kill pid 2408: os: process already finished
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.36s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-680000 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (10.14s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-680000 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:344: "nginx-svc" [f8218ed7-256a-4285-a98b-5d41935941bb] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx-svc" [f8218ed7-256a-4285-a98b-5d41935941bb] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 10.003659706s
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (10.14s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-680000 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.111.102.188 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.04s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:319: (dbg) Run:  dig +time=5 +tries=3 @10.96.0.10 nginx-svc.default.svc.cluster.local. A
functional_test_tunnel_test.go:327: DNS resolution by dig for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.04s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.03s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:351: (dbg) Run:  dscacheutil -q host -a name nginx-svc.default.svc.cluster.local.
functional_test_tunnel_test.go:359: DNS resolution by dscacheutil for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.03s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:424: tunnel at http://nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-darwin-amd64 -p functional-680000 tunnel --alsologtostderr] ...
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

TestFunctional/parallel/ServiceCmd/DeployApp (6.11s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1435: (dbg) Run:  kubectl --context functional-680000 create deployment hello-node --image=registry.k8s.io/echoserver:1.8
functional_test.go:1441: (dbg) Run:  kubectl --context functional-680000 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:344: "hello-node-6d85cfcfd8-f8bjr" [ae2154a2-e2a2-4b17-b66e-6e3dbf64fb37] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-6d85cfcfd8-f8bjr" [ae2154a2-e2a2-4b17-b66e-6e3dbf64fb37] Running
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 6.00321825s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (6.11s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.24s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1266: (dbg) Run:  out/minikube-darwin-amd64 profile lis
functional_test.go:1271: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.24s)

TestFunctional/parallel/ProfileCmd/profile_list (0.25s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1306: (dbg) Run:  out/minikube-darwin-amd64 profile list
functional_test.go:1311: Took "168.000953ms" to run "out/minikube-darwin-amd64 profile list"
functional_test.go:1320: (dbg) Run:  out/minikube-darwin-amd64 profile list -l
functional_test.go:1325: Took "78.601225ms" to run "out/minikube-darwin-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.25s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.25s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1357: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json
functional_test.go:1362: Took "173.248724ms" to run "out/minikube-darwin-amd64 profile list -o json"
functional_test.go:1370: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json --light
functional_test.go:1375: Took "77.727707ms" to run "out/minikube-darwin-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.25s)

TestFunctional/parallel/MountCmd/any-port (5.9s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port566186587/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1722444742689755000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port566186587/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1722444742689755000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port566186587/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1722444742689755000" to /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port566186587/001/test-1722444742689755000
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (117.688479ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Jul 31 16:52 created-by-test
-rw-r--r-- 1 docker docker 24 Jul 31 16:52 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Jul 31 16:52 test-1722444742689755000
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh cat /mount-9p/test-1722444742689755000
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-680000 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:344: "busybox-mount" [ff45548f-661a-4f33-8567-5204b01b5819] Pending
helpers_test.go:344: "busybox-mount" [ff45548f-661a-4f33-8567-5204b01b5819] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:344: "busybox-mount" [ff45548f-661a-4f33-8567-5204b01b5819] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "busybox-mount" [ff45548f-661a-4f33-8567-5204b01b5819] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 4.002644556s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-680000 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdany-port566186587/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (5.90s)

TestFunctional/parallel/ServiceCmd/List (0.38s)

=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1455: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.38s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.37s)

=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1485: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 service list -o json
functional_test.go:1490: Took "365.544842ms" to run "out/minikube-darwin-amd64 -p functional-680000 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.37s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.24s)

=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1505: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 service --namespace=default --https --url hello-node
functional_test.go:1518: found endpoint: https://192.169.0.4:31912
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.24s)

TestFunctional/parallel/ServiceCmd/Format (0.24s)

=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1536: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.24s)

TestFunctional/parallel/ServiceCmd/URL (0.25s)

=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1555: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 service hello-node --url
functional_test.go:1561: found endpoint for hello-node: http://192.169.0.4:31912
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.25s)

TestFunctional/parallel/Version/short (0.1s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short

=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2252: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 version --short
--- PASS: TestFunctional/parallel/Version/short (0.10s)

TestFunctional/parallel/Version/components (0.37s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components

=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2266: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.37s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.15s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image ls --format short --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-680000 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.9
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.30.3
registry.k8s.io/kube-proxy:v1.30.3
registry.k8s.io/kube-controller-manager:v1.30.3
registry.k8s.io/kube-apiserver:v1.30.3
registry.k8s.io/etcd:3.5.12-0
registry.k8s.io/echoserver:1.8
registry.k8s.io/coredns/coredns:v1.11.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/minikube-local-cache-test:functional-680000
docker.io/kubernetesui/metrics-scraper:<none>
docker.io/kubernetesui/dashboard:<none>
docker.io/kicbase/echo-server:functional-680000
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-680000 image ls --format short --alsologtostderr:
I0731 09:52:48.533840    2905 out.go:291] Setting OutFile to fd 1 ...
I0731 09:52:48.534107    2905 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0731 09:52:48.534112    2905 out.go:304] Setting ErrFile to fd 2...
I0731 09:52:48.534116    2905 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0731 09:52:48.534285    2905 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
I0731 09:52:48.534873    2905 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
I0731 09:52:48.534965    2905 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
I0731 09:52:48.535309    2905 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0731 09:52:48.535353    2905 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0731 09:52:48.543491    2905 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50989
I0731 09:52:48.543885    2905 main.go:141] libmachine: () Calling .GetVersion
I0731 09:52:48.544298    2905 main.go:141] libmachine: Using API Version  1
I0731 09:52:48.544308    2905 main.go:141] libmachine: () Calling .SetConfigRaw
I0731 09:52:48.544567    2905 main.go:141] libmachine: () Calling .GetMachineName
I0731 09:52:48.544704    2905 main.go:141] libmachine: (functional-680000) Calling .GetState
I0731 09:52:48.544800    2905 main.go:141] libmachine: (functional-680000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0731 09:52:48.544862    2905 main.go:141] libmachine: (functional-680000) DBG | hyperkit pid from json: 2174
I0731 09:52:48.546219    2905 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0731 09:52:48.546254    2905 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0731 09:52:48.554434    2905 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50991
I0731 09:52:48.554778    2905 main.go:141] libmachine: () Calling .GetVersion
I0731 09:52:48.555100    2905 main.go:141] libmachine: Using API Version  1
I0731 09:52:48.555116    2905 main.go:141] libmachine: () Calling .SetConfigRaw
I0731 09:52:48.555387    2905 main.go:141] libmachine: () Calling .GetMachineName
I0731 09:52:48.555515    2905 main.go:141] libmachine: (functional-680000) Calling .DriverName
I0731 09:52:48.555685    2905 ssh_runner.go:195] Run: systemctl --version
I0731 09:52:48.555702    2905 main.go:141] libmachine: (functional-680000) Calling .GetSSHHostname
I0731 09:52:48.555788    2905 main.go:141] libmachine: (functional-680000) Calling .GetSSHPort
I0731 09:52:48.555863    2905 main.go:141] libmachine: (functional-680000) Calling .GetSSHKeyPath
I0731 09:52:48.555939    2905 main.go:141] libmachine: (functional-680000) Calling .GetSSHUsername
I0731 09:52:48.556022    2905 sshutil.go:53] new ssh client: &{IP:192.169.0.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/functional-680000/id_rsa Username:docker}
I0731 09:52:48.585050    2905 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0731 09:52:48.605949    2905 main.go:141] libmachine: Making call to close driver server
I0731 09:52:48.605959    2905 main.go:141] libmachine: (functional-680000) Calling .Close
I0731 09:52:48.606114    2905 main.go:141] libmachine: Successfully made call to close driver server
I0731 09:52:48.606123    2905 main.go:141] libmachine: Making call to close connection to plugin binary
I0731 09:52:48.606128    2905 main.go:141] libmachine: Making call to close driver server
I0731 09:52:48.606133    2905 main.go:141] libmachine: (functional-680000) Calling .Close
I0731 09:52:48.606137    2905 main.go:141] libmachine: (functional-680000) DBG | Closing plugin on server side
I0731 09:52:48.606258    2905 main.go:141] libmachine: (functional-680000) DBG | Closing plugin on server side
I0731 09:52:48.606332    2905 main.go:141] libmachine: Successfully made call to close driver server
I0731 09:52:48.606358    2905 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.15s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image ls --format table --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-680000 image ls --format table --alsologtostderr:
|---------------------------------------------|-------------------|---------------|--------|
|                    Image                    |        Tag        |   Image ID    |  Size  |
|---------------------------------------------|-------------------|---------------|--------|
| registry.k8s.io/kube-controller-manager     | v1.30.3           | 76932a3b37d7e | 111MB  |
| docker.io/library/nginx                     | latest            | a72860cb95fd5 | 188MB  |
| gcr.io/k8s-minikube/storage-provisioner     | v5                | 6e38f40d628db | 31.5MB |
| registry.k8s.io/kube-apiserver              | v1.30.3           | 1f6d574d502f3 | 117MB  |
| registry.k8s.io/kube-scheduler              | v1.30.3           | 3edc18e7b7672 | 62MB   |
| docker.io/library/nginx                     | alpine            | 1ae23480369fa | 43.2MB |
| registry.k8s.io/pause                       | 3.1               | da86e6ba6ca19 | 742kB  |
| docker.io/library/minikube-local-cache-test | functional-680000 | d46e6aabd0193 | 30B    |
| registry.k8s.io/kube-proxy                  | v1.30.3           | 55bb025d2cfa5 | 84.7MB |
| registry.k8s.io/coredns/coredns             | v1.11.1           | cbb01a7bd410d | 59.8MB |
| registry.k8s.io/pause                       | 3.9               | e6f1816883972 | 744kB  |
| docker.io/kubernetesui/dashboard            | <none>            | 07655ddf2eebe | 246MB  |
| docker.io/kubernetesui/metrics-scraper      | <none>            | 115053965e86b | 43.8MB |
| registry.k8s.io/pause                       | latest            | 350b164e7ae1d | 240kB  |
| registry.k8s.io/etcd                        | 3.5.12-0          | 3861cfcd7c04c | 149MB  |
| docker.io/kicbase/echo-server               | functional-680000 | 9056ab77afb8e | 4.94MB |
| registry.k8s.io/pause                       | 3.3               | 0184c1613d929 | 683kB  |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc      | 56cc512116c8f | 4.4MB  |
| registry.k8s.io/echoserver                  | 1.8               | 82e4c8a736a4f | 95.4MB |
|---------------------------------------------|-------------------|---------------|--------|
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-680000 image ls --format table --alsologtostderr:
I0731 09:52:48.851889    2913 out.go:291] Setting OutFile to fd 1 ...
I0731 09:52:48.852628    2913 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0731 09:52:48.852639    2913 out.go:304] Setting ErrFile to fd 2...
I0731 09:52:48.852645    2913 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0731 09:52:48.853175    2913 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
I0731 09:52:48.853816    2913 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
I0731 09:52:48.853909    2913 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
I0731 09:52:48.854292    2913 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0731 09:52:48.854351    2913 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0731 09:52:48.862900    2913 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50999
I0731 09:52:48.863331    2913 main.go:141] libmachine: () Calling .GetVersion
I0731 09:52:48.863764    2913 main.go:141] libmachine: Using API Version  1
I0731 09:52:48.863793    2913 main.go:141] libmachine: () Calling .SetConfigRaw
I0731 09:52:48.864020    2913 main.go:141] libmachine: () Calling .GetMachineName
I0731 09:52:48.864135    2913 main.go:141] libmachine: (functional-680000) Calling .GetState
I0731 09:52:48.864236    2913 main.go:141] libmachine: (functional-680000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0731 09:52:48.864307    2913 main.go:141] libmachine: (functional-680000) DBG | hyperkit pid from json: 2174
I0731 09:52:48.865689    2913 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0731 09:52:48.865717    2913 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0731 09:52:48.874105    2913 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51001
I0731 09:52:48.874464    2913 main.go:141] libmachine: () Calling .GetVersion
I0731 09:52:48.874779    2913 main.go:141] libmachine: Using API Version  1
I0731 09:52:48.874800    2913 main.go:141] libmachine: () Calling .SetConfigRaw
I0731 09:52:48.875037    2913 main.go:141] libmachine: () Calling .GetMachineName
I0731 09:52:48.875155    2913 main.go:141] libmachine: (functional-680000) Calling .DriverName
I0731 09:52:48.875318    2913 ssh_runner.go:195] Run: systemctl --version
I0731 09:52:48.875338    2913 main.go:141] libmachine: (functional-680000) Calling .GetSSHHostname
I0731 09:52:48.875416    2913 main.go:141] libmachine: (functional-680000) Calling .GetSSHPort
I0731 09:52:48.875491    2913 main.go:141] libmachine: (functional-680000) Calling .GetSSHKeyPath
I0731 09:52:48.875572    2913 main.go:141] libmachine: (functional-680000) Calling .GetSSHUsername
I0731 09:52:48.875646    2913 sshutil.go:53] new ssh client: &{IP:192.169.0.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/functional-680000/id_rsa Username:docker}
I0731 09:52:48.905361    2913 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0731 09:52:48.926923    2913 main.go:141] libmachine: Making call to close driver server
I0731 09:52:48.926937    2913 main.go:141] libmachine: (functional-680000) Calling .Close
I0731 09:52:48.927088    2913 main.go:141] libmachine: Successfully made call to close driver server
I0731 09:52:48.927095    2913 main.go:141] libmachine: Making call to close connection to plugin binary
I0731 09:52:48.927103    2913 main.go:141] libmachine: Making call to close driver server
I0731 09:52:48.927108    2913 main.go:141] libmachine: (functional-680000) Calling .Close
I0731 09:52:48.927151    2913 main.go:141] libmachine: (functional-680000) DBG | Closing plugin on server side
I0731 09:52:48.927239    2913 main.go:141] libmachine: Successfully made call to close driver server
I0731 09:52:48.927251    2913 main.go:141] libmachine: Making call to close connection to plugin binary
I0731 09:52:48.927254    2913 main.go:141] libmachine: (functional-680000) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.16s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image ls --format json --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-680000 image ls --format json --alsologtostderr:
[{"id":"1f6d574d502f3b61c851b1bbd4ef2a964ce4c70071dd8da556f2d490d36b095d","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.30.3"],"size":"117000000"},{"id":"76932a3b37d7eb138c8f47c9a2b4218f0466dd273badf856f2ce2f0277e15b5e","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.30.3"],"size":"111000000"},{"id":"3edc18e7b76722eb2eb37a0858c09caacbd422d6e0cae4c2e5ce67bc9a9795e2","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.30.3"],"size":"62000000"},{"id":"a72860cb95fd59e9c696c66441c64f18e66915fa26b249911e83c3854477ed9a","repoDigests":[],"repoTags":["docker.io/library/nginx:latest"],"size":"188000000"},{"id":"82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":[],"repoTags":["registry.k8s.io/echoserver:1.8"],"size":"95400000"},{"id":"d46e6aabd019367ea2dee88f4c95a0ef06c18d719156282833f9cdc577bd55b5","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-680000"],"size":"30"},{"id":"3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.5.12-0"],"size":"149000000"},{"id":"e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.9"],"size":"744000"},{"id":"6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"31500000"},{"id":"55bb025d2cfa592b9381d01e122e72a1ed4b29ca32f86b7d289d99da794784d1","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.30.3"],"size":"84700000"},{"id":"cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.11.1"],"size":"59800000"},{"id":"07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558","repoDigests":[],"repoTags":["docker.io/kubernetesui/dashboard:\u003cnone\u003e"],"size":"246000000"},{"id":"da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"742000"},{"id":"56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"4400000"},{"id":"350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"240000"},{"id":"1ae23480369fa4139f6dec668d7a5a941b56ea174e9cf75e09771988fe621c95","repoDigests":[],"repoTags":["docker.io/library/nginx:alpine"],"size":"43200000"},{"id":"9056ab77afb8e18e04303f11000a9d31b3f16b74c59475b899ae1b342d328d30","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-680000"],"size":"4940000"},{"id":"115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7","repoDigests":[],"repoTags":["docker.io/kubernetesui/metrics-scraper:\u003cnone\u003e"],"size":"43800000"},{"id":"0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"683000"}]
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-680000 image ls --format json --alsologtostderr:
I0731 09:52:48.689943    2909 out.go:291] Setting OutFile to fd 1 ...
I0731 09:52:48.690199    2909 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0731 09:52:48.690205    2909 out.go:304] Setting ErrFile to fd 2...
I0731 09:52:48.690208    2909 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0731 09:52:48.690372    2909 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
I0731 09:52:48.690929    2909 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
I0731 09:52:48.691055    2909 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
I0731 09:52:48.691412    2909 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0731 09:52:48.691455    2909 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0731 09:52:48.699978    2909 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50994
I0731 09:52:48.700454    2909 main.go:141] libmachine: () Calling .GetVersion
I0731 09:52:48.700891    2909 main.go:141] libmachine: Using API Version  1
I0731 09:52:48.700901    2909 main.go:141] libmachine: () Calling .SetConfigRaw
I0731 09:52:48.701180    2909 main.go:141] libmachine: () Calling .GetMachineName
I0731 09:52:48.701349    2909 main.go:141] libmachine: (functional-680000) Calling .GetState
I0731 09:52:48.701450    2909 main.go:141] libmachine: (functional-680000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0731 09:52:48.701525    2909 main.go:141] libmachine: (functional-680000) DBG | hyperkit pid from json: 2174
I0731 09:52:48.703044    2909 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0731 09:52:48.703069    2909 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0731 09:52:48.711567    2909 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50996
I0731 09:52:48.711910    2909 main.go:141] libmachine: () Calling .GetVersion
I0731 09:52:48.712242    2909 main.go:141] libmachine: Using API Version  1
I0731 09:52:48.712259    2909 main.go:141] libmachine: () Calling .SetConfigRaw
I0731 09:52:48.712489    2909 main.go:141] libmachine: () Calling .GetMachineName
I0731 09:52:48.712623    2909 main.go:141] libmachine: (functional-680000) Calling .DriverName
I0731 09:52:48.712795    2909 ssh_runner.go:195] Run: systemctl --version
I0731 09:52:48.712813    2909 main.go:141] libmachine: (functional-680000) Calling .GetSSHHostname
I0731 09:52:48.712898    2909 main.go:141] libmachine: (functional-680000) Calling .GetSSHPort
I0731 09:52:48.712986    2909 main.go:141] libmachine: (functional-680000) Calling .GetSSHKeyPath
I0731 09:52:48.713067    2909 main.go:141] libmachine: (functional-680000) Calling .GetSSHUsername
I0731 09:52:48.713152    2909 sshutil.go:53] new ssh client: &{IP:192.169.0.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/functional-680000/id_rsa Username:docker}
I0731 09:52:48.743261    2909 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0731 09:52:48.770603    2909 main.go:141] libmachine: Making call to close driver server
I0731 09:52:48.770614    2909 main.go:141] libmachine: (functional-680000) Calling .Close
I0731 09:52:48.770762    2909 main.go:141] libmachine: Successfully made call to close driver server
I0731 09:52:48.770774    2909 main.go:141] libmachine: Making call to close connection to plugin binary
I0731 09:52:48.770779    2909 main.go:141] libmachine: Making call to close driver server
I0731 09:52:48.770784    2909 main.go:141] libmachine: (functional-680000) DBG | Closing plugin on server side
I0731 09:52:48.770796    2909 main.go:141] libmachine: (functional-680000) Calling .Close
I0731 09:52:48.770925    2909 main.go:141] libmachine: Successfully made call to close driver server
I0731 09:52:48.770935    2909 main.go:141] libmachine: Making call to close connection to plugin binary
I0731 09:52:48.770934    2909 main.go:141] libmachine: (functional-680000) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.16s)

TestFunctional/parallel/ImageCommands/ImageListYaml (0.17s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image ls --format yaml --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-680000 image ls --format yaml --alsologtostderr:
- id: 76932a3b37d7eb138c8f47c9a2b4218f0466dd273badf856f2ce2f0277e15b5e
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.30.3
size: "111000000"
- id: 55bb025d2cfa592b9381d01e122e72a1ed4b29ca32f86b7d289d99da794784d1
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.30.3
size: "84700000"
- id: cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.11.1
size: "59800000"
- id: e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.9
size: "744000"
- id: 0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "683000"
- id: 56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "4400000"
- id: 1f6d574d502f3b61c851b1bbd4ef2a964ce4c70071dd8da556f2d490d36b095d
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.30.3
size: "117000000"
- id: 3edc18e7b76722eb2eb37a0858c09caacbd422d6e0cae4c2e5ce67bc9a9795e2
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.30.3
size: "62000000"
- id: 82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests: []
repoTags:
- registry.k8s.io/echoserver:1.8
size: "95400000"
- id: 9056ab77afb8e18e04303f11000a9d31b3f16b74c59475b899ae1b342d328d30
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-680000
size: "4940000"
- id: d46e6aabd019367ea2dee88f4c95a0ef06c18d719156282833f9cdc577bd55b5
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-680000
size: "30"
- id: 3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.5.12-0
size: "149000000"
- id: 07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558
repoDigests: []
repoTags:
- docker.io/kubernetesui/dashboard:<none>
size: "246000000"
- id: 115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7
repoDigests: []
repoTags:
- docker.io/kubernetesui/metrics-scraper:<none>
size: "43800000"
- id: 6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "31500000"
- id: da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "742000"
- id: 350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "240000"
- id: a72860cb95fd59e9c696c66441c64f18e66915fa26b249911e83c3854477ed9a
repoDigests: []
repoTags:
- docker.io/library/nginx:latest
size: "188000000"
- id: 1ae23480369fa4139f6dec668d7a5a941b56ea174e9cf75e09771988fe621c95
repoDigests: []
repoTags:
- docker.io/library/nginx:alpine
size: "43200000"

functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-680000 image ls --format yaml --alsologtostderr:
I0731 09:52:49.007781    2917 out.go:291] Setting OutFile to fd 1 ...
I0731 09:52:49.008079    2917 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0731 09:52:49.008084    2917 out.go:304] Setting ErrFile to fd 2...
I0731 09:52:49.008088    2917 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0731 09:52:49.008258    2917 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
I0731 09:52:49.008873    2917 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
I0731 09:52:49.008967    2917 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
I0731 09:52:49.009306    2917 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0731 09:52:49.009375    2917 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0731 09:52:49.017988    2917 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51004
I0731 09:52:49.018447    2917 main.go:141] libmachine: () Calling .GetVersion
I0731 09:52:49.018986    2917 main.go:141] libmachine: Using API Version  1
I0731 09:52:49.019017    2917 main.go:141] libmachine: () Calling .SetConfigRaw
I0731 09:52:49.019287    2917 main.go:141] libmachine: () Calling .GetMachineName
I0731 09:52:49.019456    2917 main.go:141] libmachine: (functional-680000) Calling .GetState
I0731 09:52:49.019570    2917 main.go:141] libmachine: (functional-680000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0731 09:52:49.019651    2917 main.go:141] libmachine: (functional-680000) DBG | hyperkit pid from json: 2174
I0731 09:52:49.021093    2917 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0731 09:52:49.021120    2917 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0731 09:52:49.029980    2917 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51006
I0731 09:52:49.030430    2917 main.go:141] libmachine: () Calling .GetVersion
I0731 09:52:49.030909    2917 main.go:141] libmachine: Using API Version  1
I0731 09:52:49.030933    2917 main.go:141] libmachine: () Calling .SetConfigRaw
I0731 09:52:49.031228    2917 main.go:141] libmachine: () Calling .GetMachineName
I0731 09:52:49.031377    2917 main.go:141] libmachine: (functional-680000) Calling .DriverName
I0731 09:52:49.031580    2917 ssh_runner.go:195] Run: systemctl --version
I0731 09:52:49.031610    2917 main.go:141] libmachine: (functional-680000) Calling .GetSSHHostname
I0731 09:52:49.031746    2917 main.go:141] libmachine: (functional-680000) Calling .GetSSHPort
I0731 09:52:49.031866    2917 main.go:141] libmachine: (functional-680000) Calling .GetSSHKeyPath
I0731 09:52:49.032005    2917 main.go:141] libmachine: (functional-680000) Calling .GetSSHUsername
I0731 09:52:49.032117    2917 sshutil.go:53] new ssh client: &{IP:192.169.0.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/functional-680000/id_rsa Username:docker}
I0731 09:52:49.068274    2917 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0731 09:52:49.097245    2917 main.go:141] libmachine: Making call to close driver server
I0731 09:52:49.097270    2917 main.go:141] libmachine: (functional-680000) Calling .Close
I0731 09:52:49.097447    2917 main.go:141] libmachine: Successfully made call to close driver server
I0731 09:52:49.097459    2917 main.go:141] libmachine: Making call to close connection to plugin binary
I0731 09:52:49.097475    2917 main.go:141] libmachine: Making call to close driver server
I0731 09:52:49.097480    2917 main.go:141] libmachine: (functional-680000) Calling .Close
I0731 09:52:49.097515    2917 main.go:141] libmachine: (functional-680000) DBG | Closing plugin on server side
I0731 09:52:49.097632    2917 main.go:141] libmachine: Successfully made call to close driver server
I0731 09:52:49.097640    2917 main.go:141] libmachine: Making call to close connection to plugin binary
I0731 09:52:49.097634    2917 main.go:141] libmachine: (functional-680000) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.17s)

TestFunctional/parallel/ImageCommands/ImageBuild (2.68s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:307: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh pgrep buildkitd
functional_test.go:307: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh pgrep buildkitd: exit status 1 (145.284577ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image build -t localhost/my-image:functional-680000 testdata/build --alsologtostderr
functional_test.go:314: (dbg) Done: out/minikube-darwin-amd64 -p functional-680000 image build -t localhost/my-image:functional-680000 testdata/build --alsologtostderr: (2.374581176s)
functional_test.go:322: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-680000 image build -t localhost/my-image:functional-680000 testdata/build --alsologtostderr:
I0731 09:52:49.326005    2926 out.go:291] Setting OutFile to fd 1 ...
I0731 09:52:49.326367    2926 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0731 09:52:49.326374    2926 out.go:304] Setting ErrFile to fd 2...
I0731 09:52:49.326378    2926 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0731 09:52:49.326564    2926 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
I0731 09:52:49.327162    2926 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
I0731 09:52:49.328423    2926 config.go:182] Loaded profile config "functional-680000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
I0731 09:52:49.328777    2926 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0731 09:52:49.328813    2926 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0731 09:52:49.337789    2926 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51016
I0731 09:52:49.338236    2926 main.go:141] libmachine: () Calling .GetVersion
I0731 09:52:49.338678    2926 main.go:141] libmachine: Using API Version  1
I0731 09:52:49.338688    2926 main.go:141] libmachine: () Calling .SetConfigRaw
I0731 09:52:49.338952    2926 main.go:141] libmachine: () Calling .GetMachineName
I0731 09:52:49.339084    2926 main.go:141] libmachine: (functional-680000) Calling .GetState
I0731 09:52:49.339178    2926 main.go:141] libmachine: (functional-680000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0731 09:52:49.339251    2926 main.go:141] libmachine: (functional-680000) DBG | hyperkit pid from json: 2174
I0731 09:52:49.340718    2926 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0731 09:52:49.340741    2926 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0731 09:52:49.349389    2926 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51018
I0731 09:52:49.349783    2926 main.go:141] libmachine: () Calling .GetVersion
I0731 09:52:49.350157    2926 main.go:141] libmachine: Using API Version  1
I0731 09:52:49.350174    2926 main.go:141] libmachine: () Calling .SetConfigRaw
I0731 09:52:49.350405    2926 main.go:141] libmachine: () Calling .GetMachineName
I0731 09:52:49.350531    2926 main.go:141] libmachine: (functional-680000) Calling .DriverName
I0731 09:52:49.350707    2926 ssh_runner.go:195] Run: systemctl --version
I0731 09:52:49.350725    2926 main.go:141] libmachine: (functional-680000) Calling .GetSSHHostname
I0731 09:52:49.350831    2926 main.go:141] libmachine: (functional-680000) Calling .GetSSHPort
I0731 09:52:49.350918    2926 main.go:141] libmachine: (functional-680000) Calling .GetSSHKeyPath
I0731 09:52:49.351027    2926 main.go:141] libmachine: (functional-680000) Calling .GetSSHUsername
I0731 09:52:49.351112    2926 sshutil.go:53] new ssh client: &{IP:192.169.0.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/functional-680000/id_rsa Username:docker}
I0731 09:52:49.388786    2926 build_images.go:161] Building image from path: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/build.3052106369.tar
I0731 09:52:49.388883    2926 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I0731 09:52:49.398312    2926 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.3052106369.tar
I0731 09:52:49.402952    2926 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.3052106369.tar: stat -c "%s %y" /var/lib/minikube/build/build.3052106369.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.3052106369.tar': No such file or directory
I0731 09:52:49.402981    2926 ssh_runner.go:362] scp /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/build.3052106369.tar --> /var/lib/minikube/build/build.3052106369.tar (3072 bytes)
I0731 09:52:49.444639    2926 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.3052106369
I0731 09:52:49.461049    2926 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.3052106369 -xf /var/lib/minikube/build/build.3052106369.tar
I0731 09:52:49.473887    2926 docker.go:360] Building image: /var/lib/minikube/build/build.3052106369
I0731 09:52:49.473964    2926 ssh_runner.go:195] Run: docker build -t localhost/my-image:functional-680000 /var/lib/minikube/build/build.3052106369
#0 building with "default" instance using docker driver

#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 0.9s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B 0.0s done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b done
#5 sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 770B / 770B done
#5 sha256:62ffc2ed7554e4c6d360bce40bbcf196573dd27c4ce080641a2c59867e732dee 527B / 527B done
#5 sha256:beae173ccac6ad749f76713cf4440fe3d21d1043fe616dfbe30775815d1d0f6a 1.46kB / 1.46kB done
#5 sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 0B / 772.79kB 0.1s
#5 sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 772.79kB / 772.79kB 0.2s done
#5 extracting sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa
#5 extracting sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 0.1s done
#5 DONE 0.4s

#6 [2/3] RUN true
#6 DONE 0.5s

#7 [3/3] ADD content.txt /
#7 DONE 0.0s

#8 exporting to image
#8 exporting layers 0.0s done
#8 writing image sha256:03d9ad2be16d0249866f5fb2109a19e5cda3fe71d3ed152399662ace074d7831 done
#8 naming to localhost/my-image:functional-680000 done
#8 DONE 0.0s
I0731 09:52:51.588054    2926 ssh_runner.go:235] Completed: docker build -t localhost/my-image:functional-680000 /var/lib/minikube/build/build.3052106369: (2.114094904s)
I0731 09:52:51.588117    2926 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.3052106369
I0731 09:52:51.598193    2926 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.3052106369.tar
I0731 09:52:51.608037    2926 build_images.go:217] Built localhost/my-image:functional-680000 from /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/build.3052106369.tar
I0731 09:52:51.608063    2926 build_images.go:133] succeeded building to: functional-680000
I0731 09:52:51.608069    2926 build_images.go:134] failed building to: 
I0731 09:52:51.608085    2926 main.go:141] libmachine: Making call to close driver server
I0731 09:52:51.608096    2926 main.go:141] libmachine: (functional-680000) Calling .Close
I0731 09:52:51.608247    2926 main.go:141] libmachine: (functional-680000) DBG | Closing plugin on server side
I0731 09:52:51.608279    2926 main.go:141] libmachine: Successfully made call to close driver server
I0731 09:52:51.608288    2926 main.go:141] libmachine: Making call to close connection to plugin binary
I0731 09:52:51.608296    2926 main.go:141] libmachine: Making call to close driver server
I0731 09:52:51.608301    2926 main.go:141] libmachine: (functional-680000) Calling .Close
I0731 09:52:51.608448    2926 main.go:141] libmachine: (functional-680000) DBG | Closing plugin on server side
I0731 09:52:51.608460    2926 main.go:141] libmachine: Successfully made call to close driver server
I0731 09:52:51.608467    2926 main.go:141] libmachine: Making call to close connection to plugin binary
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (2.68s)

TestFunctional/parallel/ImageCommands/Setup (1.83s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:341: (dbg) Run:  docker pull docker.io/kicbase/echo-server:1.0
functional_test.go:341: (dbg) Done: docker pull docker.io/kicbase/echo-server:1.0: (1.761976612s)
functional_test.go:346: (dbg) Run:  docker tag docker.io/kicbase/echo-server:1.0 docker.io/kicbase/echo-server:functional-680000
--- PASS: TestFunctional/parallel/ImageCommands/Setup (1.83s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (0.92s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:354: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image load --daemon docker.io/kicbase/echo-server:functional-680000 --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (0.92s)

TestFunctional/parallel/MountCmd/VerifyCleanup (1.7s)

=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1623262760/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1623262760/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1623262760/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T" /mount1: exit status 1 (160.1079ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T" /mount1: exit status 1 (201.669811ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-darwin-amd64 mount -p functional-680000 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1623262760/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1623262760/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdVerifyCleanup1623262760/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (1.70s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.41s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:364: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image load --daemon docker.io/kicbase/echo-server:functional-680000 --alsologtostderr
functional_test.go:364: (dbg) Done: out/minikube-darwin-amd64 -p functional-680000 image load --daemon docker.io/kicbase/echo-server:functional-680000 --alsologtostderr: (1.262084759s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.41s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.42s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:234: (dbg) Run:  docker pull docker.io/kicbase/echo-server:latest
functional_test.go:239: (dbg) Run:  docker tag docker.io/kicbase/echo-server:latest docker.io/kicbase/echo-server:functional-680000
functional_test.go:244: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image load --daemon docker.io/kicbase/echo-server:functional-680000 --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (1.42s)

TestFunctional/parallel/DockerEnv/bash (0.6s)

=== RUN   TestFunctional/parallel/DockerEnv/bash
functional_test.go:495: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-680000 docker-env) && out/minikube-darwin-amd64 status -p functional-680000"
functional_test.go:518: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-680000 docker-env) && docker images"
--- PASS: TestFunctional/parallel/DockerEnv/bash (0.60s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.29s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:379: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image save docker.io/kicbase/echo-server:functional-680000 /Users/jenkins/workspace/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.29s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.31s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:391: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image rm docker.io/kicbase/echo-server:functional-680000 --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.31s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.19s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.21s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.21s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.19s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.52s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:408: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image load /Users/jenkins/workspace/echo-server-save.tar --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (0.52s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.35s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:418: (dbg) Run:  docker rmi docker.io/kicbase/echo-server:functional-680000
functional_test.go:423: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 image save --daemon docker.io/kicbase/echo-server:functional-680000 --alsologtostderr
functional_test.go:428: (dbg) Run:  docker image inspect docker.io/kicbase/echo-server:functional-680000
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.35s)

TestFunctional/delete_echo-server_images (0.04s)

=== RUN   TestFunctional/delete_echo-server_images
functional_test.go:189: (dbg) Run:  docker rmi -f docker.io/kicbase/echo-server:1.0
functional_test.go:189: (dbg) Run:  docker rmi -f docker.io/kicbase/echo-server:functional-680000
--- PASS: TestFunctional/delete_echo-server_images (0.04s)

TestFunctional/delete_my-image_image (0.02s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:197: (dbg) Run:  docker rmi -f localhost/my-image:functional-680000
--- PASS: TestFunctional/delete_my-image_image (0.02s)

TestFunctional/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:205: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-680000
--- PASS: TestFunctional/delete_minikube_cached_images (0.02s)

TestMultiControlPlane/serial/StartCluster (210.89s)

=== RUN   TestMultiControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p ha-393000 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=hyperkit 
E0731 09:53:27.152497    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 09:53:54.848858    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
ha_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p ha-393000 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=hyperkit : (3m30.514025266s)
ha_test.go:107: (dbg) Run:  out/minikube-darwin-amd64 -p ha-393000 status -v=7 --alsologtostderr
--- PASS: TestMultiControlPlane/serial/StartCluster (210.89s)

TestMultiControlPlane/serial/DeployApp (6.24s)

=== RUN   TestMultiControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- rollout status deployment/busybox
ha_test.go:133: (dbg) Done: out/minikube-darwin-amd64 kubectl -p ha-393000 -- rollout status deployment/busybox: (3.884719213s)
ha_test.go:140: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:163: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-b94zr -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-n8d7h -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-zln22 -- nslookup kubernetes.io
ha_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-b94zr -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-n8d7h -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-zln22 -- nslookup kubernetes.default
ha_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-b94zr -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-n8d7h -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-zln22 -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiControlPlane/serial/DeployApp (6.24s)

TestMultiControlPlane/serial/PingHostFromPods (1.24s)

=== RUN   TestMultiControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-b94zr -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-b94zr -- sh -c "ping -c 1 192.169.0.1"
ha_test.go:207: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-n8d7h -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-n8d7h -- sh -c "ping -c 1 192.169.0.1"
ha_test.go:207: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-zln22 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-393000 -- exec busybox-fc5497c4f-zln22 -- sh -c "ping -c 1 192.169.0.1"
--- PASS: TestMultiControlPlane/serial/PingHostFromPods (1.24s)

TestMultiControlPlane/serial/NodeLabels (0.05s)

=== RUN   TestMultiControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-393000 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiControlPlane/serial/NodeLabels (0.05s)

TestMultiControlPlane/serial/HAppyAfterClusterStart (0.34s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiControlPlane/serial/HAppyAfterClusterStart (0.34s)

TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.27s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:390: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.27s)

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (0.34s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart
ha_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (0.34s)

TestImageBuild/serial/Setup (37.14s)

=== RUN   TestImageBuild/serial/Setup
image_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -p image-275000 --driver=hyperkit 
E0731 10:13:17.555969    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 10:13:27.205026    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
image_test.go:69: (dbg) Done: out/minikube-darwin-amd64 start -p image-275000 --driver=hyperkit : (37.139096154s)
--- PASS: TestImageBuild/serial/Setup (37.14s)

TestImageBuild/serial/NormalBuild (1.67s)

=== RUN   TestImageBuild/serial/NormalBuild
image_test.go:78: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal -p image-275000
image_test.go:78: (dbg) Done: out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal -p image-275000: (1.666482579s)
--- PASS: TestImageBuild/serial/NormalBuild (1.67s)

TestImageBuild/serial/BuildWithBuildArg (0.79s)

=== RUN   TestImageBuild/serial/BuildWithBuildArg
image_test.go:99: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest --build-opt=build-arg=ENV_A=test_env_str --build-opt=no-cache ./testdata/image-build/test-arg -p image-275000
--- PASS: TestImageBuild/serial/BuildWithBuildArg (0.79s)

TestImageBuild/serial/BuildWithDockerIgnore (0.70s)

=== RUN   TestImageBuild/serial/BuildWithDockerIgnore
image_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal --build-opt=no-cache -p image-275000
--- PASS: TestImageBuild/serial/BuildWithDockerIgnore (0.70s)

TestImageBuild/serial/BuildWithSpecifiedDockerfile (0.59s)

=== RUN   TestImageBuild/serial/BuildWithSpecifiedDockerfile
image_test.go:88: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest -f inner/Dockerfile ./testdata/image-build/test-f -p image-275000
--- PASS: TestImageBuild/serial/BuildWithSpecifiedDockerfile (0.59s)

TestJSONOutput/start/Command (459.48s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-614000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit 
E0731 10:16:54.505566    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 10:18:27.205306    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 10:21:30.267811    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p json-output-614000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit : (7m39.475065496s)
--- PASS: TestJSONOutput/start/Command (459.48s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/pause/Command (0.48s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 pause -p json-output-614000 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.48s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.45s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 unpause -p json-output-614000 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.45s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (8.34s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 stop -p json-output-614000 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 stop -p json-output-614000 --output=json --user=testUser: (8.334981711s)
--- PASS: TestJSONOutput/stop/Command (8.34s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.58s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-error-105000 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p json-output-error-105000 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (362.613831ms)
-- stdout --
	{"specversion":"1.0","id":"f3ea22fa-4513-4b02-913c-7257864a8903","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-105000] minikube v1.33.1 on Darwin 14.5","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"a1a47a35-2892-492b-bff5-f76a29a81454","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=19349"}}
	{"specversion":"1.0","id":"3c2f7e8e-be03-42ad-b615-a2330a3b7bb3","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig"}}
	{"specversion":"1.0","id":"69e329ca-56ba-461e-81c0-eececa60dddf","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-darwin-amd64"}}
	{"specversion":"1.0","id":"afcf18dc-3575-46ec-aa8e-bd1bddf68695","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"37ed5768-f67d-493e-9833-3f73d39c3b76","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube"}}
	{"specversion":"1.0","id":"1ec24b3d-37b3-4a11-bc75-61b9e2c7b3de","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"3a79f36e-9eee-4121-804d-b668a309f6bd","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on darwin/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}
-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-105000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p json-output-error-105000
--- PASS: TestErrorJSONOutput (0.58s)

TestMainNoArgs (0.08s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-darwin-amd64
--- PASS: TestMainNoArgs (0.08s)

TestMinikubeProfile (93.10s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p first-501000 --driver=hyperkit 
E0731 10:21:54.504610    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p first-501000 --driver=hyperkit : (40.864723955s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p second-503000 --driver=hyperkit 
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p second-503000 --driver=hyperkit : (40.955796379s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile first-501000
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile second-503000
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
helpers_test.go:175: Cleaning up "second-503000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p second-503000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p second-503000: (5.265450874s)
helpers_test.go:175: Cleaning up "first-501000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p first-501000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p first-501000: (5.269171386s)
--- PASS: TestMinikubeProfile (93.10s)

TestMultiNode/serial/FreshStart2Nodes (112.69s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-858000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit 
E0731 10:26:54.566541    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
multinode_test.go:96: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-858000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit : (1m52.442426906s)
multinode_test.go:102: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (112.69s)

TestMultiNode/serial/DeployApp2Nodes (5.63s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:498: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-858000 -- rollout status deployment/busybox: (4.008861029s)
multinode_test.go:505: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- exec busybox-fc5497c4f-gz4wg -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- exec busybox-fc5497c4f-tv8g2 -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- exec busybox-fc5497c4f-gz4wg -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- exec busybox-fc5497c4f-tv8g2 -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- exec busybox-fc5497c4f-gz4wg -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- exec busybox-fc5497c4f-tv8g2 -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (5.63s)

TestMultiNode/serial/PingHostFrom2Pods (0.87s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- exec busybox-fc5497c4f-gz4wg -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- exec busybox-fc5497c4f-gz4wg -- sh -c "ping -c 1 192.169.0.1"
multinode_test.go:572: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- exec busybox-fc5497c4f-tv8g2 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-858000 -- exec busybox-fc5497c4f-tv8g2 -- sh -c "ping -c 1 192.169.0.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.87s)

TestMultiNode/serial/AddNode (45.47s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-858000 -v 3 --alsologtostderr
E0731 10:28:27.269786    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
multinode_test.go:121: (dbg) Done: out/minikube-darwin-amd64 node add -p multinode-858000 -v 3 --alsologtostderr: (45.153706816s)
multinode_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (45.47s)

TestMultiNode/serial/MultiNodeLabels (0.05s)

=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-858000 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.05s)

TestMultiNode/serial/ProfileList (0.18s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.18s)

TestMultiNode/serial/CopyFile (5.33s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 status --output json --alsologtostderr
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp testdata/cp-test.txt multinode-858000:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp multinode-858000:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile548201865/001/cp-test_multinode-858000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp multinode-858000:/home/docker/cp-test.txt multinode-858000-m02:/home/docker/cp-test_multinode-858000_multinode-858000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m02 "sudo cat /home/docker/cp-test_multinode-858000_multinode-858000-m02.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp multinode-858000:/home/docker/cp-test.txt multinode-858000-m03:/home/docker/cp-test_multinode-858000_multinode-858000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m03 "sudo cat /home/docker/cp-test_multinode-858000_multinode-858000-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp testdata/cp-test.txt multinode-858000-m02:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp multinode-858000-m02:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile548201865/001/cp-test_multinode-858000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp multinode-858000-m02:/home/docker/cp-test.txt multinode-858000:/home/docker/cp-test_multinode-858000-m02_multinode-858000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000 "sudo cat /home/docker/cp-test_multinode-858000-m02_multinode-858000.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp multinode-858000-m02:/home/docker/cp-test.txt multinode-858000-m03:/home/docker/cp-test_multinode-858000-m02_multinode-858000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m03 "sudo cat /home/docker/cp-test_multinode-858000-m02_multinode-858000-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp testdata/cp-test.txt multinode-858000-m03:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp multinode-858000-m03:/home/docker/cp-test.txt /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestMultiNodeserialCopyFile548201865/001/cp-test_multinode-858000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp multinode-858000-m03:/home/docker/cp-test.txt multinode-858000:/home/docker/cp-test_multinode-858000-m03_multinode-858000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000 "sudo cat /home/docker/cp-test_multinode-858000-m03_multinode-858000.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 cp multinode-858000-m03:/home/docker/cp-test.txt multinode-858000-m02:/home/docker/cp-test_multinode-858000-m03_multinode-858000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 ssh -n multinode-858000-m02 "sudo cat /home/docker/cp-test_multinode-858000-m03_multinode-858000-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (5.33s)

TestMultiNode/serial/StopNode (2.86s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p multinode-858000 node stop m03: (2.339217046s)
multinode_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-858000 status: exit status 7 (263.102703ms)

-- stdout --
	multinode-858000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-858000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-858000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-858000 status --alsologtostderr: exit status 7 (253.578352ms)

-- stdout --
	multinode-858000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-858000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-858000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I0731 10:28:39.221535    4686 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:28:39.221711    4686 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:28:39.221716    4686 out.go:304] Setting ErrFile to fd 2...
	I0731 10:28:39.221720    4686 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:28:39.221885    4686 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:28:39.222097    4686 out.go:298] Setting JSON to false
	I0731 10:28:39.222120    4686 mustload.go:65] Loading cluster: multinode-858000
	I0731 10:28:39.222156    4686 notify.go:220] Checking for updates...
	I0731 10:28:39.222411    4686 config.go:182] Loaded profile config "multinode-858000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:28:39.222427    4686 status.go:255] checking status of multinode-858000 ...
	I0731 10:28:39.222797    4686 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:28:39.222864    4686 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:28:39.231931    4686 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53387
	I0731 10:28:39.232262    4686 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:28:39.232672    4686 main.go:141] libmachine: Using API Version  1
	I0731 10:28:39.232682    4686 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:28:39.232875    4686 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:28:39.232976    4686 main.go:141] libmachine: (multinode-858000) Calling .GetState
	I0731 10:28:39.233060    4686 main.go:141] libmachine: (multinode-858000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:28:39.233125    4686 main.go:141] libmachine: (multinode-858000) DBG | hyperkit pid from json: 4389
	I0731 10:28:39.234332    4686 status.go:330] multinode-858000 host status = "Running" (err=<nil>)
	I0731 10:28:39.234350    4686 host.go:66] Checking if "multinode-858000" exists ...
	I0731 10:28:39.234589    4686 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:28:39.234608    4686 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:28:39.243243    4686 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53389
	I0731 10:28:39.243639    4686 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:28:39.243991    4686 main.go:141] libmachine: Using API Version  1
	I0731 10:28:39.244004    4686 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:28:39.244229    4686 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:28:39.244342    4686 main.go:141] libmachine: (multinode-858000) Calling .GetIP
	I0731 10:28:39.244434    4686 host.go:66] Checking if "multinode-858000" exists ...
	I0731 10:28:39.244683    4686 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:28:39.244707    4686 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:28:39.254736    4686 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53391
	I0731 10:28:39.255076    4686 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:28:39.255420    4686 main.go:141] libmachine: Using API Version  1
	I0731 10:28:39.255436    4686 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:28:39.255609    4686 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:28:39.255713    4686 main.go:141] libmachine: (multinode-858000) Calling .DriverName
	I0731 10:28:39.255858    4686 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:28:39.255876    4686 main.go:141] libmachine: (multinode-858000) Calling .GetSSHHostname
	I0731 10:28:39.255943    4686 main.go:141] libmachine: (multinode-858000) Calling .GetSSHPort
	I0731 10:28:39.256051    4686 main.go:141] libmachine: (multinode-858000) Calling .GetSSHKeyPath
	I0731 10:28:39.256133    4686 main.go:141] libmachine: (multinode-858000) Calling .GetSSHUsername
	I0731 10:28:39.256223    4686 sshutil.go:53] new ssh client: &{IP:192.169.0.15 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/multinode-858000/id_rsa Username:docker}
	I0731 10:28:39.292776    4686 ssh_runner.go:195] Run: systemctl --version
	I0731 10:28:39.297614    4686 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:28:39.308769    4686 kubeconfig.go:125] found "multinode-858000" server: "https://192.169.0.15:8443"
	I0731 10:28:39.308794    4686 api_server.go:166] Checking apiserver status ...
	I0731 10:28:39.308833    4686 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0731 10:28:39.320024    4686 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/2001/cgroup
	W0731 10:28:39.327455    4686 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/2001/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0731 10:28:39.327499    4686 ssh_runner.go:195] Run: ls
	I0731 10:28:39.330795    4686 api_server.go:253] Checking apiserver healthz at https://192.169.0.15:8443/healthz ...
	I0731 10:28:39.333748    4686 api_server.go:279] https://192.169.0.15:8443/healthz returned 200:
	ok
	I0731 10:28:39.333759    4686 status.go:422] multinode-858000 apiserver status = Running (err=<nil>)
	I0731 10:28:39.333768    4686 status.go:257] multinode-858000 status: &{Name:multinode-858000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:28:39.333780    4686 status.go:255] checking status of multinode-858000-m02 ...
	I0731 10:28:39.334030    4686 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:28:39.334051    4686 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:28:39.342459    4686 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53395
	I0731 10:28:39.342789    4686 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:28:39.343099    4686 main.go:141] libmachine: Using API Version  1
	I0731 10:28:39.343108    4686 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:28:39.343323    4686 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:28:39.343427    4686 main.go:141] libmachine: (multinode-858000-m02) Calling .GetState
	I0731 10:28:39.343515    4686 main.go:141] libmachine: (multinode-858000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:28:39.343581    4686 main.go:141] libmachine: (multinode-858000-m02) DBG | hyperkit pid from json: 4407
	I0731 10:28:39.344817    4686 status.go:330] multinode-858000-m02 host status = "Running" (err=<nil>)
	I0731 10:28:39.344826    4686 host.go:66] Checking if "multinode-858000-m02" exists ...
	I0731 10:28:39.345071    4686 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:28:39.345092    4686 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:28:39.353343    4686 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53397
	I0731 10:28:39.353678    4686 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:28:39.354055    4686 main.go:141] libmachine: Using API Version  1
	I0731 10:28:39.354070    4686 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:28:39.354258    4686 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:28:39.354364    4686 main.go:141] libmachine: (multinode-858000-m02) Calling .GetIP
	I0731 10:28:39.354454    4686 host.go:66] Checking if "multinode-858000-m02" exists ...
	I0731 10:28:39.354697    4686 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:28:39.354726    4686 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:28:39.362990    4686 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53399
	I0731 10:28:39.363382    4686 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:28:39.363714    4686 main.go:141] libmachine: Using API Version  1
	I0731 10:28:39.363745    4686 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:28:39.363941    4686 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:28:39.364063    4686 main.go:141] libmachine: (multinode-858000-m02) Calling .DriverName
	I0731 10:28:39.364196    4686 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0731 10:28:39.364208    4686 main.go:141] libmachine: (multinode-858000-m02) Calling .GetSSHHostname
	I0731 10:28:39.364284    4686 main.go:141] libmachine: (multinode-858000-m02) Calling .GetSSHPort
	I0731 10:28:39.364379    4686 main.go:141] libmachine: (multinode-858000-m02) Calling .GetSSHKeyPath
	I0731 10:28:39.364477    4686 main.go:141] libmachine: (multinode-858000-m02) Calling .GetSSHUsername
	I0731 10:28:39.364555    4686 sshutil.go:53] new ssh client: &{IP:192.169.0.16 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/19349-1046/.minikube/machines/multinode-858000-m02/id_rsa Username:docker}
	I0731 10:28:39.398339    4686 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0731 10:28:39.408587    4686 status.go:257] multinode-858000-m02 status: &{Name:multinode-858000-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:28:39.408614    4686 status.go:255] checking status of multinode-858000-m03 ...
	I0731 10:28:39.408918    4686 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:28:39.408944    4686 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:28:39.417781    4686 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53402
	I0731 10:28:39.418140    4686 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:28:39.418466    4686 main.go:141] libmachine: Using API Version  1
	I0731 10:28:39.418477    4686 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:28:39.418676    4686 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:28:39.418780    4686 main.go:141] libmachine: (multinode-858000-m03) Calling .GetState
	I0731 10:28:39.418864    4686 main.go:141] libmachine: (multinode-858000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:28:39.418940    4686 main.go:141] libmachine: (multinode-858000-m03) DBG | hyperkit pid from json: 4472
	I0731 10:28:39.420143    4686 main.go:141] libmachine: (multinode-858000-m03) DBG | hyperkit pid 4472 missing from process table
	I0731 10:28:39.420161    4686 status.go:330] multinode-858000-m03 host status = "Stopped" (err=<nil>)
	I0731 10:28:39.420169    4686 status.go:343] host is not running, skipping remaining checks
	I0731 10:28:39.420176    4686 status.go:257] multinode-858000-m03 status: &{Name:multinode-858000-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.86s)

TestMultiNode/serial/StartAfterStop (41.81s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 node start m03 -v=7 --alsologtostderr
multinode_test.go:282: (dbg) Done: out/minikube-darwin-amd64 -p multinode-858000 node start m03 -v=7 --alsologtostderr: (41.441670307s)
multinode_test.go:290: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 status -v=7 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (41.81s)

TestMultiNode/serial/RestartKeepsNodes (140.28s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-858000
multinode_test.go:321: (dbg) Run:  out/minikube-darwin-amd64 stop -p multinode-858000
multinode_test.go:321: (dbg) Done: out/minikube-darwin-amd64 stop -p multinode-858000: (18.811406794s)
multinode_test.go:326: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-858000 --wait=true -v=8 --alsologtostderr
E0731 10:29:57.622857    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
multinode_test.go:326: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-858000 --wait=true -v=8 --alsologtostderr: (2m1.353214106s)
multinode_test.go:331: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-858000
--- PASS: TestMultiNode/serial/RestartKeepsNodes (140.28s)

TestMultiNode/serial/DeleteNode (3.44s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 node delete m03
multinode_test.go:416: (dbg) Done: out/minikube-darwin-amd64 -p multinode-858000 node delete m03: (3.104234304s)
multinode_test.go:422: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 status --alsologtostderr
multinode_test.go:436: (dbg) Run:  kubectl get nodes
multinode_test.go:444: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (3.44s)

TestMultiNode/serial/StopMultiNode (16.8s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 stop
E0731 10:31:54.571335    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
multinode_test.go:345: (dbg) Done: out/minikube-darwin-amd64 -p multinode-858000 stop: (16.63437595s)
multinode_test.go:351: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-858000 status: exit status 7 (85.197233ms)

-- stdout --
	multinode-858000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-858000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-858000 status --alsologtostderr: exit status 7 (77.541922ms)

-- stdout --
	multinode-858000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-858000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	
-- /stdout --
** stderr ** 
	I0731 10:32:01.722481    5158 out.go:291] Setting OutFile to fd 1 ...
	I0731 10:32:01.722652    5158 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:32:01.722658    5158 out.go:304] Setting ErrFile to fd 2...
	I0731 10:32:01.722661    5158 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0731 10:32:01.722830    5158 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/19349-1046/.minikube/bin
	I0731 10:32:01.723012    5158 out.go:298] Setting JSON to false
	I0731 10:32:01.723043    5158 mustload.go:65] Loading cluster: multinode-858000
	I0731 10:32:01.723074    5158 notify.go:220] Checking for updates...
	I0731 10:32:01.723333    5158 config.go:182] Loaded profile config "multinode-858000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.3
	I0731 10:32:01.723349    5158 status.go:255] checking status of multinode-858000 ...
	I0731 10:32:01.723704    5158 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:32:01.723757    5158 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:32:01.732300    5158 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53638
	I0731 10:32:01.732617    5158 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:32:01.733077    5158 main.go:141] libmachine: Using API Version  1
	I0731 10:32:01.733096    5158 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:32:01.733304    5158 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:32:01.733446    5158 main.go:141] libmachine: (multinode-858000) Calling .GetState
	I0731 10:32:01.733532    5158 main.go:141] libmachine: (multinode-858000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:32:01.733603    5158 main.go:141] libmachine: (multinode-858000) DBG | hyperkit pid from json: 4762
	I0731 10:32:01.734560    5158 main.go:141] libmachine: (multinode-858000) DBG | hyperkit pid 4762 missing from process table
	I0731 10:32:01.734590    5158 status.go:330] multinode-858000 host status = "Stopped" (err=<nil>)
	I0731 10:32:01.734596    5158 status.go:343] host is not running, skipping remaining checks
	I0731 10:32:01.734603    5158 status.go:257] multinode-858000 status: &{Name:multinode-858000 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0731 10:32:01.734621    5158 status.go:255] checking status of multinode-858000-m02 ...
	I0731 10:32:01.734865    5158 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0731 10:32:01.734886    5158 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0731 10:32:01.743405    5158 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53640
	I0731 10:32:01.743763    5158 main.go:141] libmachine: () Calling .GetVersion
	I0731 10:32:01.744143    5158 main.go:141] libmachine: Using API Version  1
	I0731 10:32:01.744165    5158 main.go:141] libmachine: () Calling .SetConfigRaw
	I0731 10:32:01.744366    5158 main.go:141] libmachine: () Calling .GetMachineName
	I0731 10:32:01.744489    5158 main.go:141] libmachine: (multinode-858000-m02) Calling .GetState
	I0731 10:32:01.744596    5158 main.go:141] libmachine: (multinode-858000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0731 10:32:01.744651    5158 main.go:141] libmachine: (multinode-858000-m02) DBG | hyperkit pid from json: 5072
	I0731 10:32:01.745589    5158 main.go:141] libmachine: (multinode-858000-m02) DBG | hyperkit pid 5072 missing from process table
	I0731 10:32:01.745620    5158 status.go:330] multinode-858000-m02 host status = "Stopped" (err=<nil>)
	I0731 10:32:01.745628    5158 status.go:343] host is not running, skipping remaining checks
	I0731 10:32:01.745636    5158 status.go:257] multinode-858000-m02 status: &{Name:multinode-858000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (16.80s)

TestMultiNode/serial/RestartMultiNode (107.06s)

=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-858000 --wait=true -v=8 --alsologtostderr --driver=hyperkit 
E0731 10:33:27.272906    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
multinode_test.go:376: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-858000 --wait=true -v=8 --alsologtostderr --driver=hyperkit : (1m46.72717958s)
multinode_test.go:382: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-858000 status --alsologtostderr
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (107.06s)

TestMultiNode/serial/ValidateNameConflict (43.98s)

=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-858000
multinode_test.go:464: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-858000-m02 --driver=hyperkit 
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-858000-m02 --driver=hyperkit : exit status 14 (417.602191ms)

-- stdout --
	* [multinode-858000-m02] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	
-- /stdout --
** stderr ** 
	! Profile name 'multinode-858000-m02' is duplicated with machine name 'multinode-858000-m02' in profile 'multinode-858000'
	X Exiting due to MK_USAGE: Profile name should be unique

** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-858000-m03 --driver=hyperkit 
multinode_test.go:472: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-858000-m03 --driver=hyperkit : (37.994384989s)
multinode_test.go:479: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-858000
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p multinode-858000: exit status 80 (257.275473ms)

-- stdout --
	* Adding node m03 to cluster multinode-858000 as [worker]
	
	
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-858000-m03 already exists in multinode-858000-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-darwin-amd64 delete -p multinode-858000-m03
multinode_test.go:484: (dbg) Done: out/minikube-darwin-amd64 delete -p multinode-858000-m03: (5.253560219s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (43.98s)

TestPreload (277.28s)

=== RUN   TestPreload
preload_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-503000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4
E0731 10:36:54.576446    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
preload_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-503000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4: (3m4.897761112s)
preload_test.go:52: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-503000 image pull gcr.io/k8s-minikube/busybox
preload_test.go:52: (dbg) Done: out/minikube-darwin-amd64 -p test-preload-503000 image pull gcr.io/k8s-minikube/busybox: (1.216002254s)
preload_test.go:58: (dbg) Run:  out/minikube-darwin-amd64 stop -p test-preload-503000
preload_test.go:58: (dbg) Done: out/minikube-darwin-amd64 stop -p test-preload-503000: (8.382688656s)
preload_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-503000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit 
E0731 10:38:10.341126    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 10:38:27.277584    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
preload_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-503000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit : (1m17.386091373s)
preload_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-503000 image list
helpers_test.go:175: Cleaning up "test-preload-503000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p test-preload-503000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p test-preload-503000: (5.24154283s)
--- PASS: TestPreload (277.28s)

TestSkaffold (113.18s)

=== RUN   TestSkaffold
skaffold_test.go:59: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe2089763205 version
skaffold_test.go:59: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe2089763205 version: (1.712372812s)
skaffold_test.go:63: skaffold version: v2.13.1
skaffold_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p skaffold-593000 --memory=2600 --driver=hyperkit 
E0731 10:41:54.579225    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
skaffold_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p skaffold-593000 --memory=2600 --driver=hyperkit : (37.289781539s)
skaffold_test.go:86: copying out/minikube-darwin-amd64 to /Users/jenkins/workspace/out/minikube
skaffold_test.go:105: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe2089763205 run --minikube-profile skaffold-593000 --kube-context skaffold-593000 --status-check=true --port-forward=false --interactive=false
skaffold_test.go:105: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/skaffold.exe2089763205 run --minikube-profile skaffold-593000 --kube-context skaffold-593000 --status-check=true --port-forward=false --interactive=false: (56.343597073s)
skaffold_test.go:111: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-app" in namespace "default" ...
helpers_test.go:344: "leeroy-app-845f696677-fj9n5" [50508cb8-cf18-4893-bdd5-7fcfcca33b6d] Running
skaffold_test.go:111: (dbg) TestSkaffold: app=leeroy-app healthy within 6.003160714s
skaffold_test.go:114: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-web" in namespace "default" ...
helpers_test.go:344: "leeroy-web-c8c56f9bb-58m29" [33cf548b-c310-4f3f-b5b3-4c72d53599b5] Running
E0731 10:43:27.283661    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
skaffold_test.go:114: (dbg) TestSkaffold: app=leeroy-web healthy within 5.004139851s
helpers_test.go:175: Cleaning up "skaffold-593000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p skaffold-593000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p skaffold-593000: (5.240770856s)
--- PASS: TestSkaffold (113.18s)

TestRunningBinaryUpgrade (74.2s)

=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.1899114967 start -p running-upgrade-137000 --memory=2200 --vm-driver=hyperkit 
version_upgrade_test.go:120: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.1899114967 start -p running-upgrade-137000 --memory=2200 --vm-driver=hyperkit : (42.227194523s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 start -p running-upgrade-137000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:130: (dbg) Done: out/minikube-darwin-amd64 start -p running-upgrade-137000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (25.792137527s)
helpers_test.go:175: Cleaning up "running-upgrade-137000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p running-upgrade-137000
E0731 10:48:19.517168    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:48:19.523082    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:48:19.535141    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:48:19.556115    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:48:19.597609    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:48:19.677717    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:48:19.838058    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p running-upgrade-137000: (5.236999092s)
--- PASS: TestRunningBinaryUpgrade (74.20s)

TestKubernetesUpgrade (204.08s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade
=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-666000 --memory=2200 --kubernetes-version=v1.20.0 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:222: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-666000 --memory=2200 --kubernetes-version=v1.20.0 --alsologtostderr -v=1 --driver=hyperkit : (1m50.935901197s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-darwin-amd64 stop -p kubernetes-upgrade-666000
version_upgrade_test.go:227: (dbg) Done: out/minikube-darwin-amd64 stop -p kubernetes-upgrade-666000: (2.380153667s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-darwin-amd64 -p kubernetes-upgrade-666000 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p kubernetes-upgrade-666000 status --format={{.Host}}: exit status 7 (65.8317ms)

-- stdout --
	Stopped

-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-666000 --memory=2200 --kubernetes-version=v1.31.0-beta.0 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:243: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-666000 --memory=2200 --kubernetes-version=v1.31.0-beta.0 --alsologtostderr -v=1 --driver=hyperkit : (58.277379037s)
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-666000 version --output=json
version_upgrade_test.go:267: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:269: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-666000 --memory=2200 --kubernetes-version=v1.20.0 --driver=hyperkit 
version_upgrade_test.go:269: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p kubernetes-upgrade-666000 --memory=2200 --kubernetes-version=v1.20.0 --driver=hyperkit : exit status 106 (804.779887ms)

-- stdout --
	* [kubernetes-upgrade-666000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.31.0-beta.0 cluster to v1.20.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.20.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-666000
	    minikube start -p kubernetes-upgrade-666000 --kubernetes-version=v1.20.0
	    
	    2) Create a second cluster with Kubernetes 1.20.0, by running:
	    
	    minikube start -p kubernetes-upgrade-6660002 --kubernetes-version=v1.20.0
	    
	    3) Use the existing cluster at version Kubernetes 1.31.0-beta.0, by running:
	    
	    minikube start -p kubernetes-upgrade-666000 --kubernetes-version=v1.31.0-beta.0
	    

** /stderr **
version_upgrade_test.go:273: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:275: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-666000 --memory=2200 --kubernetes-version=v1.31.0-beta.0 --alsologtostderr -v=1 --driver=hyperkit 
E0731 10:46:37.637747    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
E0731 10:46:54.583984    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
version_upgrade_test.go:275: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-666000 --memory=2200 --kubernetes-version=v1.31.0-beta.0 --alsologtostderr -v=1 --driver=hyperkit : (26.312029172s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-666000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p kubernetes-upgrade-666000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p kubernetes-upgrade-666000: (5.24893809s)
--- PASS: TestKubernetesUpgrade (204.08s)

TestStoppedBinaryUpgrade/Setup (1.85s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (1.85s)

TestStoppedBinaryUpgrade/Upgrade (110.04s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.3528530717 start -p stopped-upgrade-234000 --memory=2200 --vm-driver=hyperkit 
version_upgrade_test.go:183: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.3528530717 start -p stopped-upgrade-234000 --memory=2200 --vm-driver=hyperkit : (52.693974399s)
version_upgrade_test.go:192: (dbg) Run:  /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.3528530717 -p stopped-upgrade-234000 stop
version_upgrade_test.go:192: (dbg) Done: /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/minikube-v1.26.0.3528530717 -p stopped-upgrade-234000 stop: (8.269138223s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-darwin-amd64 start -p stopped-upgrade-234000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:198: (dbg) Done: out/minikube-darwin-amd64 start -p stopped-upgrade-234000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (49.061312428s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (110.04s)

TestStoppedBinaryUpgrade/MinikubeLogs (2.7s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-darwin-amd64 logs -p stopped-upgrade-234000
version_upgrade_test.go:206: (dbg) Done: out/minikube-darwin-amd64 logs -p stopped-upgrade-234000: (2.703024915s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.70s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.46s)

=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-782000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit 
E0731 10:48:20.158849    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p NoKubernetes-782000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit : exit status 14 (455.244593ms)

-- stdout --
	* [NoKubernetes-782000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=19349
	  - KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/19349-1046/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.46s)

TestNoKubernetes/serial/StartWithK8s (86.19s)

=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-782000 --driver=hyperkit 
E0731 10:48:20.800297    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:48:22.080492    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:48:24.642695    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:48:27.287472    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/addons-937000/client.crt: no such file or directory
E0731 10:48:29.764903    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:48:40.006310    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:49:00.486943    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:49:41.448019    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
no_kubernetes_test.go:95: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-782000 --driver=hyperkit : (1m26.024344106s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-782000 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (86.19s)

TestNoKubernetes/serial/StartWithStopK8s (56.95s)

=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-782000 --no-kubernetes --driver=hyperkit 
no_kubernetes_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-782000 --no-kubernetes --driver=hyperkit : (54.408217601s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-782000 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p NoKubernetes-782000 status -o json: exit status 2 (147.568772ms)

-- stdout --
	{"Name":"NoKubernetes-782000","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-darwin-amd64 delete -p NoKubernetes-782000
no_kubernetes_test.go:124: (dbg) Done: out/minikube-darwin-amd64 delete -p NoKubernetes-782000: (2.393151136s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (56.95s)

TestNoKubernetes/serial/Start (73.39s)

=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-782000 --no-kubernetes --driver=hyperkit 
E0731 10:51:03.258047    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/skaffold-593000/client.crt: no such file or directory
E0731 10:51:54.476083    1591 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/19349-1046/.minikube/profiles/functional-680000/client.crt: no such file or directory
no_kubernetes_test.go:136: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-782000 --no-kubernetes --driver=hyperkit : (1m13.386470518s)
--- PASS: TestNoKubernetes/serial/Start (73.39s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.12s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-782000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-782000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (124.034347ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.12s)

TestNoKubernetes/serial/ProfileList (0.37s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 profile list
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.37s)

TestNoKubernetes/serial/Stop (2.35s)

=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-darwin-amd64 stop -p NoKubernetes-782000
no_kubernetes_test.go:158: (dbg) Done: out/minikube-darwin-amd64 stop -p NoKubernetes-782000: (2.35377747s)
--- PASS: TestNoKubernetes/serial/Stop (2.35s)

TestNoKubernetes/serial/StartNoArgs (75.49s)

=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-782000 --driver=hyperkit 
no_kubernetes_test.go:191: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-782000 --driver=hyperkit : (1m15.487369664s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (75.49s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (4.08s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current
* minikube v1.33.1 on darwin
- MINIKUBE_LOCATION=19349
- KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2102176747/001
* Using the hyperkit driver based on user configuration
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2102176747/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2102176747/001/.minikube/bin/docker-machine-driver-hyperkit 
! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2102176747/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting "minikube" primary control-plane node in "minikube" cluster
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (4.08s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (7.65s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current
* minikube v1.33.1 on darwin
- MINIKUBE_LOCATION=19349
- KUBECONFIG=/Users/jenkins/minikube-integration/19349-1046/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current3580919124/001
* Using the hyperkit driver based on user configuration
* Downloading driver docker-machine-driver-hyperkit:
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current3580919124/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current3580919124/001/.minikube/bin/docker-machine-driver-hyperkit 
! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current3580919124/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting "minikube" primary control-plane node in "minikube" cluster
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (7.65s)

TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.17s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-782000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-782000 "sudo systemctl is-active --quiet service kubelet": exit status 80 (165.330818ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_STATUS: Unable to get control-plane node NoKubernetes-782000 host status: state: docker-machine-driver-hyperkit needs to run with elevated permissions. Please run the following command, then try again: sudo chown root:wheel /Users/jenkins/workspace/testdata/hyperkit-driver-without-version/docker-machine-driver-hyperkit && sudo chmod u+s /Users/jenkins/workspace/testdata/hyperkit-driver-without-version/docker-machine-driver-hyperkit

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.17s)

Test skip (21/227)

TestDownloadOnly/v1.20.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.20.0/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.20.0/cached-images (0.00s)

TestDownloadOnly/v1.20.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.20.0/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.20.0/binaries (0.00s)

TestDownloadOnly/v1.30.3/cached-images (0s)

=== RUN   TestDownloadOnly/v1.30.3/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.30.3/cached-images (0.00s)

TestDownloadOnly/v1.30.3/binaries (0s)

=== RUN   TestDownloadOnly/v1.30.3/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.30.3/binaries (0.00s)

TestDownloadOnly/v1.31.0-beta.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.31.0-beta.0/cached-images (0.00s)

TestDownloadOnly/v1.31.0-beta.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.31.0-beta.0/binaries (0.00s)

TestDownloadOnlyKic (0s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:220: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm
=== CONT  TestAddons/parallel/Olm
addons_test.go:500: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestDockerEnvContainerd (0s)

=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with docker false darwin amd64
docker_test.go:172: skipping: TestDockerEnvContainerd can only be run with the containerd runtime on Docker driver
--- SKIP: TestDockerEnvContainerd (0.00s)

TestKVMDriverInstallOrUpdate (0s)

=== RUN   TestKVMDriverInstallOrUpdate
driver_install_or_update_test.go:41: Skip if not linux.
--- SKIP: TestKVMDriverInstallOrUpdate (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv
=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:546: only validate podman env with docker container runtime, currently testing docker
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestFunctional/parallel/MountCmd/specific-port (14.53s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port2608951170/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (156.226164ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (156.880008ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (116.938884ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (130.23938ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (119.699615ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (121.513149ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
2024/07/31 09:52:39 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (117.517929ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (146.616098ms)

** stderr **
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:251: skipping: mount did not appear, likely because macOS requires prompt to allow non-code signed binaries to listen on non-localhost port
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 -p functional-680000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-680000 ssh "sudo umount -f /mount-9p": exit status 1 (123.277227ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr **
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-darwin-amd64 -p functional-680000 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-680000 /var/folders/0y/_8hvl7v13q38_kkh25vpxkz00000gp/T/TestFunctionalparallelMountCmdspecific-port2608951170/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- SKIP: TestFunctional/parallel/MountCmd/specific-port (14.53s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestImageBuild/serial/validateImageBuildWithBuildEnv (0s)

=== RUN   TestImageBuild/serial/validateImageBuildWithBuildEnv
image_test.go:114: skipping due to https://github.com/kubernetes/minikube/issues/12431
--- SKIP: TestImageBuild/serial/validateImageBuildWithBuildEnv (0.00s)

TestKicCustomNetwork (0s)

=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

TestKicExistingNetwork (0s)

=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

TestKicCustomSubnet (0s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

TestKicStaticIP (0s)

=== RUN   TestKicStaticIP
kic_custom_network_test.go:123: only run with docker/podman driver
--- SKIP: TestKicStaticIP (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestInsufficientStorage (0s)

=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

TestMissingContainerUpgrade (0s)

=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:284: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)